Mar 09 15:57:51 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 09 15:57:51 crc restorecon[4675]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 15:57:51 crc restorecon[4675]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 15:57:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc 
restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc 
restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 
15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc 
restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc 
restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52
crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 
15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc 
restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc 
restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc 
restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 
crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc 
restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 15:57:52 crc restorecon[4675]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc 
restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 15:57:52 crc restorecon[4675]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 09 15:57:53 crc kubenswrapper[4831]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 15:57:53 crc kubenswrapper[4831]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 09 15:57:53 crc kubenswrapper[4831]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 15:57:53 crc kubenswrapper[4831]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 09 15:57:53 crc kubenswrapper[4831]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 09 15:57:53 crc kubenswrapper[4831]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.317100 4831 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327767 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327822 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327836 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327847 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327863 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327879 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327891 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327901 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327912 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327922 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327944 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327964 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.327983 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328002 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328012 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328022 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328032 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328042 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328052 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 15:57:53 crc 
kubenswrapper[4831]: W0309 15:57:53.328062 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328071 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328082 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328094 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328103 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328110 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328118 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328136 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328148 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328161 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328169 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328179 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328188 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328197 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328207 4831 feature_gate.go:330] unrecognized feature gate: Example Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328215 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328224 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328232 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328242 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328252 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328262 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328271 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328281 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328290 4831 
feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328300 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328313 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328325 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328335 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328345 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328356 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328364 4831 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328371 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328379 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328387 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328434 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328442 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328450 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328457 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 15:57:53 crc kubenswrapper[4831]: 
W0309 15:57:53.328465 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328476 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328486 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328494 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328503 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328511 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328520 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328532 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328540 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328549 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328557 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328568 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328578 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.328587 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.330995 4831 flags.go:64] FLAG: --address="0.0.0.0" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331031 4831 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331066 4831 flags.go:64] FLAG: --anonymous-auth="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331086 4831 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331103 4831 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331116 4831 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331133 4831 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331146 4831 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331156 4831 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331166 4831 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331176 4831 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331191 4831 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331201 4831 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331211 4831 
flags.go:64] FLAG: --cgroup-root="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331220 4831 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331230 4831 flags.go:64] FLAG: --client-ca-file="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331239 4831 flags.go:64] FLAG: --cloud-config="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331248 4831 flags.go:64] FLAG: --cloud-provider="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331257 4831 flags.go:64] FLAG: --cluster-dns="[]" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331270 4831 flags.go:64] FLAG: --cluster-domain="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331279 4831 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331288 4831 flags.go:64] FLAG: --config-dir="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331297 4831 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331307 4831 flags.go:64] FLAG: --container-log-max-files="5" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331319 4831 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331328 4831 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331338 4831 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331347 4831 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331357 4831 flags.go:64] FLAG: --contention-profiling="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331366 4831 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331375 4831 flags.go:64] FLAG: 
--cpu-cfs-quota-period="100ms" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331384 4831 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331422 4831 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331468 4831 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331480 4831 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331491 4831 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331511 4831 flags.go:64] FLAG: --enable-load-reader="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331531 4831 flags.go:64] FLAG: --enable-server="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331543 4831 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331562 4831 flags.go:64] FLAG: --event-burst="100" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331574 4831 flags.go:64] FLAG: --event-qps="50" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331585 4831 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331598 4831 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331610 4831 flags.go:64] FLAG: --eviction-hard="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331624 4831 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331633 4831 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331643 4831 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331657 4831 
flags.go:64] FLAG: --eviction-soft="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331666 4831 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331675 4831 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331684 4831 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331693 4831 flags.go:64] FLAG: --experimental-mounter-path="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331702 4831 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331711 4831 flags.go:64] FLAG: --fail-swap-on="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331720 4831 flags.go:64] FLAG: --feature-gates="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331732 4831 flags.go:64] FLAG: --file-check-frequency="20s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331741 4831 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331751 4831 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331762 4831 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331772 4831 flags.go:64] FLAG: --healthz-port="10248" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331782 4831 flags.go:64] FLAG: --help="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331791 4831 flags.go:64] FLAG: --hostname-override="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331800 4831 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331824 4831 flags.go:64] FLAG: --http-check-frequency="20s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331834 4831 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331843 4831 flags.go:64] FLAG: --image-credential-provider-config="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331852 4831 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331861 4831 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331870 4831 flags.go:64] FLAG: --image-service-endpoint="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331879 4831 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331888 4831 flags.go:64] FLAG: --kube-api-burst="100" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331897 4831 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331907 4831 flags.go:64] FLAG: --kube-api-qps="50" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331916 4831 flags.go:64] FLAG: --kube-reserved="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331926 4831 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331937 4831 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331949 4831 flags.go:64] FLAG: --kubelet-cgroups="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331960 4831 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331971 4831 flags.go:64] FLAG: --lock-file="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331982 4831 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.331993 4831 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332005 4831 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332035 4831 flags.go:64] FLAG: --log-json-split-stream="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332046 4831 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332055 4831 flags.go:64] FLAG: --log-text-split-stream="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332064 4831 flags.go:64] FLAG: --logging-format="text" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332073 4831 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332083 4831 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332092 4831 flags.go:64] FLAG: --manifest-url="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332101 4831 flags.go:64] FLAG: --manifest-url-header="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332115 4831 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332124 4831 flags.go:64] FLAG: --max-open-files="1000000" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332136 4831 flags.go:64] FLAG: --max-pods="110" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332145 4831 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332157 4831 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332174 4831 flags.go:64] FLAG: --memory-manager-policy="None" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332197 4831 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332211 4831 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 09 15:57:53 crc 
kubenswrapper[4831]: I0309 15:57:53.332223 4831 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332235 4831 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332270 4831 flags.go:64] FLAG: --node-status-max-images="50" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332284 4831 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332296 4831 flags.go:64] FLAG: --oom-score-adj="-999" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332308 4831 flags.go:64] FLAG: --pod-cidr="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332320 4831 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332337 4831 flags.go:64] FLAG: --pod-manifest-path="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332348 4831 flags.go:64] FLAG: --pod-max-pids="-1" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332360 4831 flags.go:64] FLAG: --pods-per-core="0" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332370 4831 flags.go:64] FLAG: --port="10250" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332382 4831 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332430 4831 flags.go:64] FLAG: --provider-id="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332444 4831 flags.go:64] FLAG: --qos-reserved="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332457 4831 flags.go:64] FLAG: --read-only-port="10255" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332469 4831 flags.go:64] FLAG: --register-node="true" Mar 09 15:57:53 crc 
kubenswrapper[4831]: I0309 15:57:53.332480 4831 flags.go:64] FLAG: --register-schedulable="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332492 4831 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332514 4831 flags.go:64] FLAG: --registry-burst="10" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332526 4831 flags.go:64] FLAG: --registry-qps="5" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332537 4831 flags.go:64] FLAG: --reserved-cpus="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332555 4831 flags.go:64] FLAG: --reserved-memory="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332571 4831 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332582 4831 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332595 4831 flags.go:64] FLAG: --rotate-certificates="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332618 4831 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332630 4831 flags.go:64] FLAG: --runonce="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332641 4831 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332652 4831 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332665 4831 flags.go:64] FLAG: --seccomp-default="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332677 4831 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332688 4831 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332699 4831 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 
09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332711 4831 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332722 4831 flags.go:64] FLAG: --storage-driver-password="root" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332734 4831 flags.go:64] FLAG: --storage-driver-secure="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332745 4831 flags.go:64] FLAG: --storage-driver-table="stats" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332755 4831 flags.go:64] FLAG: --storage-driver-user="root" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332766 4831 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332778 4831 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332790 4831 flags.go:64] FLAG: --system-cgroups="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332801 4831 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332822 4831 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332833 4831 flags.go:64] FLAG: --tls-cert-file="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.332844 4831 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333041 4831 flags.go:64] FLAG: --tls-min-version="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333052 4831 flags.go:64] FLAG: --tls-private-key-file="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333063 4831 flags.go:64] FLAG: --topology-manager-policy="none" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333074 4831 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333085 4831 flags.go:64] FLAG: 
--topology-manager-scope="container" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333097 4831 flags.go:64] FLAG: --v="2" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333112 4831 flags.go:64] FLAG: --version="false" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333127 4831 flags.go:64] FLAG: --vmodule="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333141 4831 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.333153 4831 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333491 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333509 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333525 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333536 4831 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333546 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333557 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333568 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333579 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333592 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333606 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333618 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333629 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333644 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333659 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333670 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333680 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333693 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333703 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333714 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333726 4831 feature_gate.go:330] unrecognized feature gate: Example Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333737 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333749 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333760 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 15:57:53 crc 
kubenswrapper[4831]: W0309 15:57:53.333771 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333781 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333792 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333803 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333813 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333823 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333834 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333843 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333854 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333864 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333873 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333887 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333899 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333909 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333919 4831 feature_gate.go:330] 
unrecognized feature gate: MinimumKubeletVersion Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333931 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333941 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333953 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333964 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333974 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333985 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.333994 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334005 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334015 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334025 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334035 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334044 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334053 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334062 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 15:57:53 crc 
kubenswrapper[4831]: W0309 15:57:53.334071 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334082 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334091 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334101 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334111 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334122 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334132 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334146 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334159 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334169 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334180 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334190 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334200 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334210 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334224 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334233 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334247 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334260 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.334271 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.334302 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.348580 4831 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.348631 4831 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348766 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348779 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348789 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348797 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348807 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348816 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348824 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348832 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348841 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348849 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348857 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348869 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348883 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348895 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348904 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348913 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348921 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348930 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348938 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348947 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348956 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348965 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348976 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348986 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.348995 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349004 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349013 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349021 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349030 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349040 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349048 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349057 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349064 4831 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349072 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349080 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349088 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349096 4831 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349104 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349112 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349120 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349127 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349135 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349144 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349151 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349159 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349167 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349177 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349187 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349195 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349204 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349212 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349220 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349229 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349237 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349245 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349254 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349262 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349270 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349278 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349285 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349293 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349301 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349309 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349316 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349324 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349334 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349342 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349351 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349359 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349366 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349374 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.349388 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349708 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349721 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349730 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349739 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349747 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349755 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349763 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349771 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349779 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349787 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349796 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349804 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349813 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349823 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349832 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349841 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349850 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349858 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349866 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349874 4831 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349882 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349891 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349899 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349907 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349915 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349923 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349932 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349940 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349948 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349965 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349974 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349983 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.349994 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350005 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350013 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350023 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350032 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350040 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350048 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350056 4831 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350065 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350074 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350082 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350090 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350100 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350110 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350121 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350131 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350139 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350148 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350156 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350163 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350172 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350179 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350187 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350194 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350202 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350210 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350220 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350231 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350240 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350249 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350259 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350269 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350277 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350287 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350296 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350304 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350313 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350321 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.350329 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.350341 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.352231 4831 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.357698 4831 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.362581 4831 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.362734 4831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.365534 4831 server.go:997] "Starting client certificate rotation"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.365599 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.365760 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.392817 4831 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.395386 4831 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.398445 4831 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.416140 4831 log.go:25] "Validated CRI v1 runtime API"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.456628 4831 log.go:25] "Validated CRI v1 image API"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.458876 4831 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.466604 4831 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-15-54-07-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.466666 4831 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.499048 4831 manager.go:217] Machine: {Timestamp:2026-03-09 15:57:53.493282891 +0000 UTC m=+0.626965374 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7d9f8d15-aabe-48b4-8d2f-58416afd8526 BootID:7f9fb4aa-30fc-49bf-b554-e009613f58b0 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:de:2f:a3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:de:2f:a3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:37:24:75 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3d:7e:6b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ae:71:ea Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a1:26:40 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:f4:6c:a4:b7:31 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:38:1b:87:c3:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.499516 4831 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.499739 4831 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.501239 4831 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.501568 4831 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.501627 4831 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.501969 4831 topology_manager.go:138] "Creating topology manager with none policy"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.501989 4831 container_manager_linux.go:303] "Creating device plugin manager"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.502935 4831 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.502991 4831 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.503870 4831 state_mem.go:36] "Initialized new in-memory state store"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.504066 4831 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.508506 4831 kubelet.go:418] "Attempting to sync node with API server"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.508541 4831 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.508581 4831 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.508603 4831 kubelet.go:324] "Adding apiserver pod source"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.508620 4831 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.514451 4831 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.516330 4831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.517265 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.517268 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.517380 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.517471 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.519254 4831 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 09 15:57:53
crc kubenswrapper[4831]: I0309 15:57:53.521821 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.521868 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.521884 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.521899 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.521921 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.521934 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.521948 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.521971 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.522021 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.522036 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.522069 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.522082 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.524441 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.525228 4831 server.go:1280] "Started kubelet" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 
15:57:53.527821 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:53 crc systemd[1]: Started Kubernetes Kubelet. Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.528163 4831 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.527612 4831 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.529517 4831 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.534533 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.534611 4831 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.536726 4831 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.536764 4831 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.537019 4831 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.538671 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.536992 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b37798eb45b12 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,LastTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.540927 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.540996 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="200ms" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.541339 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.542508 4831 factory.go:55] Registering systemd factory Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.542557 4831 factory.go:221] Registration of the systemd container factory successfully Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.547059 4831 server.go:460] "Adding debug handlers to kubelet server" Mar 09 15:57:53 crc 
kubenswrapper[4831]: I0309 15:57:53.549457 4831 factory.go:153] Registering CRI-O factory Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.549497 4831 factory.go:221] Registration of the crio container factory successfully Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.549598 4831 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.549638 4831 factory.go:103] Registering Raw factory Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.549665 4831 manager.go:1196] Started watching for new ooms in manager Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.551018 4831 manager.go:319] Starting recovery of all containers Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.557720 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.557850 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.557881 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.557908 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.557934 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.557960 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.557987 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558014 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558043 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558073 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558097 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558122 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558148 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558178 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558201 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558303 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558330 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558352 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558442 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558471 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558497 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558525 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558551 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558579 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558604 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558633 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558665 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558694 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558720 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558746 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558771 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558799 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558825 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558854 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558883 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558909 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558934 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558957 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.558983 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559007 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559031 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559058 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559087 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559112 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559138 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559161 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559186 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559213 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559239 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559263 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559288 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559314 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559348 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559379 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559442 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559472 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559500 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559528 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559553 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559580 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559606 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559635 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559660 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559686 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559714 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559741 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559766 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559796 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559824 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559849 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559873 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559897 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559921 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.559955 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560015 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560044 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560069 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560095 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560119 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560145 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560170 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560198 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" 
seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560224 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560251 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560276 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560300 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560325 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560349 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560374 
4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560462 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560493 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560519 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560544 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560570 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560592 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560617 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560647 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560674 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560700 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560724 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560751 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560774 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560797 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560820 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560856 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560890 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560931 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560958 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.560987 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.561061 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565025 4831 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565071 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565112 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565135 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565156 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565176 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565193 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565211 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565231 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565250 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565269 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565288 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565308 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565346 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565385 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565447 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565478 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565504 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565552 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565572 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565601 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565626 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565650 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565673 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565690 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565715 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565741 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 
15:57:53.565769 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565793 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565815 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565890 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565915 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.565978 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566002 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566025 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566054 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566078 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566100 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566127 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566150 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566173 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566196 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566219 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566244 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566300 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566326 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566351 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566376 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566438 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566465 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566488 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566511 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" 
seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566530 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566548 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566566 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566584 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566601 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566619 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 09 15:57:53 crc 
kubenswrapper[4831]: I0309 15:57:53.566638 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566656 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566674 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566690 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566708 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566726 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566746 4831 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566763 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566782 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566800 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566818 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566836 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566861 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566880 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566898 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566915 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566936 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566955 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566973 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.566991 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567010 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567028 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567046 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567064 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567082 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" 
seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567100 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567119 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567137 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567155 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567172 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567189 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 09 15:57:53 crc 
kubenswrapper[4831]: I0309 15:57:53.567208 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567226 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567247 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567265 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567282 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567299 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567320 4831 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567337 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567355 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567372 4831 reconstruct.go:97] "Volume reconstruction finished" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.567385 4831 reconciler.go:26] "Reconciler: start to sync state" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.589287 4831 manager.go:324] Recovery completed Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.602179 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.604044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.604103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.604121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.605341 4831 cpu_manager.go:225] "Starting CPU 
manager" policy="none" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.605386 4831 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.605454 4831 state_mem.go:36] "Initialized new in-memory state store" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.611901 4831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.615809 4831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.615966 4831 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.616080 4831 kubelet.go:2335] "Starting kubelet main sync loop" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.616243 4831 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 09 15:57:53 crc kubenswrapper[4831]: W0309 15:57:53.619198 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.619312 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.624781 4831 policy_none.go:49] "None policy: Start" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.626533 4831 memory_manager.go:170] "Starting memorymanager" 
policy="None" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.626585 4831 state_mem.go:35] "Initializing new in-memory state store" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.639185 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.686745 4831 manager.go:334] "Starting Device Plugin manager" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.686815 4831 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.686835 4831 server.go:79] "Starting device plugin registration server" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.687452 4831 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.687482 4831 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.687692 4831 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.687799 4831 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.687821 4831 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.698188 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.717123 4831 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.717197 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.718364 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.718419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.718430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.718584 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719079 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719144 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719517 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719666 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719905 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.719959 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720497 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720561 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720749 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.720979 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.721450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.721469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.721479 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.721578 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.721644 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.721668 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.721581 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.722414 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.722447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.722459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.722975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 
crc kubenswrapper[4831]: I0309 15:57:53.723011 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723234 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723268 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723474 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723692 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.723737 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.724566 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.724610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.724629 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.742366 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="400ms" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769333 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769375 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769427 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769461 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769488 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769510 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769573 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769611 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769645 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769672 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769701 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.769777 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.787997 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.789390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.789461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.789477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.789499 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.789948 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Mar 09 15:57:53 crc 
kubenswrapper[4831]: I0309 15:57:53.870988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871046 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871147 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871180 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871205 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871215 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871255 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871289 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 
15:57:53.871216 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871313 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871320 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871371 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871385 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871427 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871357 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871455 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871505 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871466 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871612 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871660 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871766 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871877 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.871992 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.990432 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.991924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.991971 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.992019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:53 crc kubenswrapper[4831]: I0309 15:57:53.992051 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:57:53 crc kubenswrapper[4831]: E0309 15:57:53.992677 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection 
refused" node="crc" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.071105 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.080526 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.101805 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.121232 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.123607 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-197c69f39ca2d6a488fe0bdca98108347f3ce65ef3e55a0ff9fdace75cd88f0c WatchSource:0}: Error finding container 197c69f39ca2d6a488fe0bdca98108347f3ce65ef3e55a0ff9fdace75cd88f0c: Status 404 returned error can't find the container with id 197c69f39ca2d6a488fe0bdca98108347f3ce65ef3e55a0ff9fdace75cd88f0c Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.124809 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ccfb0f1d242dbc6599715ff160b98107a867f4629af39bc14b23b4e536709d39 WatchSource:0}: Error finding container ccfb0f1d242dbc6599715ff160b98107a867f4629af39bc14b23b4e536709d39: Status 404 returned error can't find the container with id ccfb0f1d242dbc6599715ff160b98107a867f4629af39bc14b23b4e536709d39 Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.130175 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.136576 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-69a02d5a1d042ae0164c178b1234e1c082f37100b2febbab0090649ee489f96d WatchSource:0}: Error finding container 69a02d5a1d042ae0164c178b1234e1c082f37100b2febbab0090649ee489f96d: Status 404 returned error can't find the container with id 69a02d5a1d042ae0164c178b1234e1c082f37100b2febbab0090649ee489f96d Mar 09 15:57:54 crc kubenswrapper[4831]: E0309 15:57:54.143392 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="800ms" Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.144210 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5978164cfdeafd74123d88033d460caa7094c07d000aeccc4cd86bbddf8852d9 WatchSource:0}: Error finding container 5978164cfdeafd74123d88033d460caa7094c07d000aeccc4cd86bbddf8852d9: Status 404 returned error can't find the container with id 5978164cfdeafd74123d88033d460caa7094c07d000aeccc4cd86bbddf8852d9 Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.159130 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-10ecd58550fad49af461f8bf6057b41632306f926dc0fc4249f937c07da29d00 WatchSource:0}: Error finding container 10ecd58550fad49af461f8bf6057b41632306f926dc0fc4249f937c07da29d00: Status 404 returned error can't find the container with id 
10ecd58550fad49af461f8bf6057b41632306f926dc0fc4249f937c07da29d00 Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.393128 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.395222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.395263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.395280 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.395307 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:57:54 crc kubenswrapper[4831]: E0309 15:57:54.396067 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.465205 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:54 crc kubenswrapper[4831]: E0309 15:57:54.465295 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.529426 4831 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.586877 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:54 crc kubenswrapper[4831]: E0309 15:57:54.586962 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.623015 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"10ecd58550fad49af461f8bf6057b41632306f926dc0fc4249f937c07da29d00"} Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.624022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5978164cfdeafd74123d88033d460caa7094c07d000aeccc4cd86bbddf8852d9"} Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.624989 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"69a02d5a1d042ae0164c178b1234e1c082f37100b2febbab0090649ee489f96d"} Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.626104 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"197c69f39ca2d6a488fe0bdca98108347f3ce65ef3e55a0ff9fdace75cd88f0c"} Mar 09 15:57:54 crc kubenswrapper[4831]: I0309 15:57:54.627226 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ccfb0f1d242dbc6599715ff160b98107a867f4629af39bc14b23b4e536709d39"} Mar 09 15:57:54 crc kubenswrapper[4831]: W0309 15:57:54.819198 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:54 crc kubenswrapper[4831]: E0309 15:57:54.819606 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:54 crc kubenswrapper[4831]: E0309 15:57:54.945083 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="1.6s" Mar 09 15:57:55 crc kubenswrapper[4831]: W0309 15:57:55.175976 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:55 crc kubenswrapper[4831]: E0309 15:57:55.176088 4831 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.197046 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.198659 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.198709 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.198726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.198757 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:57:55 crc kubenswrapper[4831]: E0309 15:57:55.199260 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.528763 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.566493 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 15:57:55 crc kubenswrapper[4831]: E0309 15:57:55.567607 4831 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.631778 4831 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6" exitCode=0 Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.632010 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.634112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.634954 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.635017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.635071 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.636477 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5" exitCode=0 Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.636566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.636654 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.637963 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.638011 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.638036 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.639781 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640294 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4a986f4258cd0664514db523d44a9efccb0f106432bf7c9d6fe54af4099576f9"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640441 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640888 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.640920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.641854 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.641880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.641896 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.643389 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf" exitCode=0 Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.643498 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.643611 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.644679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.644707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.644721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.645824 4831 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1" exitCode=0 Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.645873 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1"} Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.645987 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.647244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.647529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
09 15:57:55 crc kubenswrapper[4831]: I0309 15:57:55.647627 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:56 crc kubenswrapper[4831]: W0309 15:57:56.523438 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:56 crc kubenswrapper[4831]: E0309 15:57:56.523588 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.528728 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:56 crc kubenswrapper[4831]: E0309 15:57:56.546615 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="3.2s" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.650678 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.650784 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.651796 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.651827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.651836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.653914 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.654072 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.654228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.654352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.654503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.654544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.654558 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.657244 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.657497 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.657657 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.657783 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.658720 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa" exitCode=0 Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.658829 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.658824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa"} Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.659226 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.659768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.659803 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.659813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.660540 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.660554 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.660562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.799385 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.800753 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.800783 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.800796 4831 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:56 crc kubenswrapper[4831]: I0309 15:57:56.800817 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:57:56 crc kubenswrapper[4831]: E0309 15:57:56.801333 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Mar 09 15:57:56 crc kubenswrapper[4831]: W0309 15:57:56.856650 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Mar 09 15:57:56 crc kubenswrapper[4831]: E0309 15:57:56.856723 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.669489 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d1ed6d96d4965c41b59abdb18b1a85c3caec35a3e49bdb89474ea153c9500473"} Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.669596 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.671271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.671319 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.671336 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.673120 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3" exitCode=0 Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.673218 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3"} Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.673233 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.673288 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.673296 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.673295 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.674793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.674825 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.674835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.674938 4831 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.674969 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.674937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.675016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.674986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.675038 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.717884 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.718099 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.719726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.719792 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.719815 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:57 crc kubenswrapper[4831]: I0309 15:57:57.736557 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.679447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48"} Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.679496 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6"} Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.679512 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291"} Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.679518 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.679672 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.679523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18"} Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.680087 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.680373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.680436 
4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.680449 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.681019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.681044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.681055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:58 crc kubenswrapper[4831]: I0309 15:57:58.929482 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.686008 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.686009 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.685992 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129"} Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.687207 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.687244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 
15:57:59.687260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.687539 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.687570 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.687587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:57:59 crc kubenswrapper[4831]: I0309 15:57:59.762365 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.001770 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.003288 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.003355 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.003377 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.003447 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.687648 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.687714 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 
15:58:00.691986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.692032 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.692042 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.692005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.692126 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.692140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.736949 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:58:00 crc kubenswrapper[4831]: I0309 15:58:00.737103 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:58:01 crc kubenswrapper[4831]: I0309 15:58:01.395593 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 
15:58:01 crc kubenswrapper[4831]: I0309 15:58:01.690167 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:01 crc kubenswrapper[4831]: I0309 15:58:01.692025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:01 crc kubenswrapper[4831]: I0309 15:58:01.692074 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:01 crc kubenswrapper[4831]: I0309 15:58:01.692087 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.177582 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.177933 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.179809 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.179993 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.180005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.779509 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.779822 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.781705 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.781793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:02 crc kubenswrapper[4831]: I0309 15:58:02.781820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.086719 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.087133 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.088932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.089006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.089032 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.245339 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.245634 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.247029 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.247075 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.247087 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.255454 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.608843 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.695248 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.695278 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.696533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.696589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.696611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.696631 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.696652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:03 crc kubenswrapper[4831]: I0309 15:58:03.696661 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:03 crc kubenswrapper[4831]: E0309 15:58:03.698296 4831 eviction_manager.go:285] 
"Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.417130 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38874->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.417205 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38874->192.168.126.11:17697: read: connection reset by peer" Mar 09 15:58:07 crc kubenswrapper[4831]: E0309 15:58:07.463987 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189b37798eb45b12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,LastTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:07 crc kubenswrapper[4831]: W0309 15:58:07.496268 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.496417 4831 trace.go:236] Trace[258835056]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 15:57:57.495) (total time: 10000ms): Mar 09 15:58:07 crc kubenswrapper[4831]: Trace[258835056]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (15:58:07.496) Mar 09 15:58:07 crc kubenswrapper[4831]: Trace[258835056]: [10.000899473s] [10.000899473s] END Mar 09 15:58:07 crc kubenswrapper[4831]: E0309 15:58:07.496448 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.529579 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.710776 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.713914 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d1ed6d96d4965c41b59abdb18b1a85c3caec35a3e49bdb89474ea153c9500473" exitCode=255 Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.713975 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d1ed6d96d4965c41b59abdb18b1a85c3caec35a3e49bdb89474ea153c9500473"} Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.714192 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.715526 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.715574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.715590 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.716199 4831 scope.go:117] "RemoveContainer" containerID="d1ed6d96d4965c41b59abdb18b1a85c3caec35a3e49bdb89474ea153c9500473" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.726749 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.726976 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.728521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.728573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.728592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:07 crc kubenswrapper[4831]: W0309 
15:58:07.746293 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 15:58:07 crc kubenswrapper[4831]: I0309 15:58:07.746452 4831 trace.go:236] Trace[1277175866]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 15:57:57.745) (total time: 10001ms): Mar 09 15:58:07 crc kubenswrapper[4831]: Trace[1277175866]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:58:07.746) Mar 09 15:58:07 crc kubenswrapper[4831]: Trace[1277175866]: [10.001208561s] [10.001208561s] END Mar 09 15:58:07 crc kubenswrapper[4831]: E0309 15:58:07.746487 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 15:58:08 crc kubenswrapper[4831]: E0309 15:58:08.377624 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.380944 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.380992 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 15:58:08 crc kubenswrapper[4831]: E0309 15:58:08.382014 4831 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:08 crc kubenswrapper[4831]: E0309 15:58:08.385612 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 15:58:08 crc kubenswrapper[4831]: W0309 15:58:08.385710 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z Mar 09 15:58:08 crc kubenswrapper[4831]: E0309 15:58:08.385775 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed 
to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.387835 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.387907 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 15:58:08 crc kubenswrapper[4831]: W0309 15:58:08.388388 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z Mar 09 15:58:08 crc kubenswrapper[4831]: E0309 15:58:08.388475 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.409081 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.531832 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:08Z is after 2026-02-23T05:33:13Z Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.718169 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.718760 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.720592 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" exitCode=255 Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.720638 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105"} Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.720674 4831 scope.go:117] "RemoveContainer" containerID="d1ed6d96d4965c41b59abdb18b1a85c3caec35a3e49bdb89474ea153c9500473" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.720754 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.721753 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.721791 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.721804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:08 crc kubenswrapper[4831]: I0309 15:58:08.722358 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:08 crc kubenswrapper[4831]: E0309 15:58:08.722570 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:09 crc kubenswrapper[4831]: I0309 15:58:09.531819 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:09Z is after 2026-02-23T05:33:13Z Mar 09 15:58:09 crc kubenswrapper[4831]: I0309 15:58:09.724502 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 15:58:09 crc kubenswrapper[4831]: I0309 15:58:09.726520 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:09 crc 
kubenswrapper[4831]: I0309 15:58:09.727336 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:09 crc kubenswrapper[4831]: I0309 15:58:09.727367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:09 crc kubenswrapper[4831]: I0309 15:58:09.727393 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:09 crc kubenswrapper[4831]: I0309 15:58:09.727859 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:09 crc kubenswrapper[4831]: E0309 15:58:09.728004 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:10 crc kubenswrapper[4831]: I0309 15:58:10.536096 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:10Z is after 2026-02-23T05:33:13Z Mar 09 15:58:10 crc kubenswrapper[4831]: I0309 15:58:10.737445 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:58:10 crc kubenswrapper[4831]: 
I0309 15:58:10.737541 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:58:11 crc kubenswrapper[4831]: W0309 15:58:11.309177 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:11Z is after 2026-02-23T05:33:13Z Mar 09 15:58:11 crc kubenswrapper[4831]: E0309 15:58:11.309292 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.403312 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.403453 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.404911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.405020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.405042 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.406015 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:11 crc kubenswrapper[4831]: E0309 15:58:11.406321 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.408394 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.534098 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:11Z is after 2026-02-23T05:33:13Z Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.732531 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.733694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.733760 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 
15:58:11.733786 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:11 crc kubenswrapper[4831]: I0309 15:58:11.734727 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:11 crc kubenswrapper[4831]: E0309 15:58:11.735015 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:12 crc kubenswrapper[4831]: W0309 15:58:12.283071 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:12Z is after 2026-02-23T05:33:13Z Mar 09 15:58:12 crc kubenswrapper[4831]: E0309 15:58:12.283174 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:12 crc kubenswrapper[4831]: I0309 15:58:12.534231 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T15:58:12Z is after 2026-02-23T05:33:13Z Mar 09 15:58:12 crc kubenswrapper[4831]: I0309 15:58:12.816775 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 09 15:58:12 crc kubenswrapper[4831]: I0309 15:58:12.817142 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:12 crc kubenswrapper[4831]: I0309 15:58:12.819147 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:12 crc kubenswrapper[4831]: I0309 15:58:12.819234 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:12 crc kubenswrapper[4831]: I0309 15:58:12.819258 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:12 crc kubenswrapper[4831]: I0309 15:58:12.836441 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 09 15:58:13 crc kubenswrapper[4831]: I0309 15:58:13.531878 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:13Z is after 2026-02-23T05:33:13Z Mar 09 15:58:13 crc kubenswrapper[4831]: E0309 15:58:13.698603 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:58:13 crc kubenswrapper[4831]: I0309 15:58:13.737876 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:13 crc kubenswrapper[4831]: I0309 15:58:13.738598 4831 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:13 crc kubenswrapper[4831]: I0309 15:58:13.738623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:13 crc kubenswrapper[4831]: I0309 15:58:13.738634 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:14 crc kubenswrapper[4831]: I0309 15:58:14.532368 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:14Z is after 2026-02-23T05:33:13Z Mar 09 15:58:14 crc kubenswrapper[4831]: E0309 15:58:14.781735 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:14Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 15:58:14 crc kubenswrapper[4831]: I0309 15:58:14.786668 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:14 crc kubenswrapper[4831]: I0309 15:58:14.787742 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:14 crc kubenswrapper[4831]: I0309 15:58:14.787798 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:14 crc kubenswrapper[4831]: I0309 15:58:14.787817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:14 crc kubenswrapper[4831]: I0309 15:58:14.787852 4831 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 09 15:58:14 crc kubenswrapper[4831]: E0309 15:58:14.792626 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:14Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 15:58:15 crc kubenswrapper[4831]: I0309 15:58:15.533957 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:15Z is after 2026-02-23T05:33:13Z Mar 09 15:58:15 crc kubenswrapper[4831]: W0309 15:58:15.990051 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:15Z is after 2026-02-23T05:33:13Z Mar 09 15:58:15 crc kubenswrapper[4831]: E0309 15:58:15.990182 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.534503 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T15:58:16Z is after 2026-02-23T05:33:13Z Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.667493 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.667754 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.669353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.669480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.669533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.670459 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:16 crc kubenswrapper[4831]: E0309 15:58:16.670741 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:16 crc kubenswrapper[4831]: I0309 15:58:16.948796 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 15:58:16 crc kubenswrapper[4831]: E0309 15:58:16.952663 4831 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the 
control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:17 crc kubenswrapper[4831]: E0309 15:58:17.470350 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:17Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b37798eb45b12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,LastTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:17 crc kubenswrapper[4831]: I0309 15:58:17.533106 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:17Z is after 2026-02-23T05:33:13Z Mar 09 15:58:18 crc kubenswrapper[4831]: W0309 15:58:18.287154 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T15:58:18Z is after 2026-02-23T05:33:13Z Mar 09 15:58:18 crc kubenswrapper[4831]: E0309 15:58:18.287259 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:18 crc kubenswrapper[4831]: I0309 15:58:18.409791 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:58:18 crc kubenswrapper[4831]: I0309 15:58:18.410024 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:18 crc kubenswrapper[4831]: I0309 15:58:18.411502 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:18 crc kubenswrapper[4831]: I0309 15:58:18.411545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:18 crc kubenswrapper[4831]: I0309 15:58:18.411583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:18 crc kubenswrapper[4831]: I0309 15:58:18.412198 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:18 crc kubenswrapper[4831]: E0309 15:58:18.412357 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:18 crc kubenswrapper[4831]: I0309 15:58:18.533975 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:18Z is after 2026-02-23T05:33:13Z Mar 09 15:58:19 crc kubenswrapper[4831]: I0309 15:58:19.533461 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:19Z is after 2026-02-23T05:33:13Z Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.533856 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:20Z is after 2026-02-23T05:33:13Z Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.737796 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.737903 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.737992 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.738206 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.739849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.739909 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.739932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.741008 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"4a986f4258cd0664514db523d44a9efccb0f106432bf7c9d6fe54af4099576f9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 15:58:20 crc kubenswrapper[4831]: I0309 15:58:20.741265 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://4a986f4258cd0664514db523d44a9efccb0f106432bf7c9d6fe54af4099576f9" gracePeriod=30 Mar 09 
15:58:21 crc kubenswrapper[4831]: W0309 15:58:21.013868 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:21Z is after 2026-02-23T05:33:13Z Mar 09 15:58:21 crc kubenswrapper[4831]: E0309 15:58:21.013966 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.534544 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:21Z is after 2026-02-23T05:33:13Z Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.768284 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.768886 4831 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4a986f4258cd0664514db523d44a9efccb0f106432bf7c9d6fe54af4099576f9" exitCode=255 Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.768956 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4a986f4258cd0664514db523d44a9efccb0f106432bf7c9d6fe54af4099576f9"} Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.769003 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f"} Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.769189 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.770327 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.770425 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.770453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:21 crc kubenswrapper[4831]: E0309 15:58:21.786013 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:21Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.793321 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.794376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.794423 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.794440 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:21 crc kubenswrapper[4831]: I0309 15:58:21.794464 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:58:21 crc kubenswrapper[4831]: E0309 15:58:21.797720 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:21Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 15:58:22 crc kubenswrapper[4831]: I0309 15:58:22.177724 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:22 crc kubenswrapper[4831]: I0309 15:58:22.532395 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:22Z is after 2026-02-23T05:33:13Z Mar 09 15:58:22 crc kubenswrapper[4831]: I0309 15:58:22.771916 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:22 crc kubenswrapper[4831]: I0309 15:58:22.772779 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:22 crc kubenswrapper[4831]: I0309 15:58:22.772803 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:22 crc kubenswrapper[4831]: I0309 15:58:22.772813 4831 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:23 crc kubenswrapper[4831]: I0309 15:58:23.533958 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:23Z is after 2026-02-23T05:33:13Z Mar 09 15:58:23 crc kubenswrapper[4831]: E0309 15:58:23.698729 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:58:24 crc kubenswrapper[4831]: W0309 15:58:24.466238 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:24Z is after 2026-02-23T05:33:13Z Mar 09 15:58:24 crc kubenswrapper[4831]: E0309 15:58:24.466388 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:24 crc kubenswrapper[4831]: I0309 15:58:24.532987 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:24Z is after 2026-02-23T05:33:13Z Mar 09 15:58:25 crc 
kubenswrapper[4831]: I0309 15:58:25.533780 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:25Z is after 2026-02-23T05:33:13Z Mar 09 15:58:26 crc kubenswrapper[4831]: I0309 15:58:26.534377 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:26Z is after 2026-02-23T05:33:13Z Mar 09 15:58:27 crc kubenswrapper[4831]: E0309 15:58:27.474376 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:27Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b37798eb45b12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,LastTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:27 crc kubenswrapper[4831]: I0309 15:58:27.531931 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:27Z is after 2026-02-23T05:33:13Z Mar 09 15:58:27 crc kubenswrapper[4831]: I0309 15:58:27.737736 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:27 crc kubenswrapper[4831]: I0309 15:58:27.737893 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:27 crc kubenswrapper[4831]: I0309 15:58:27.739034 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:27 crc kubenswrapper[4831]: I0309 15:58:27.739068 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:27 crc kubenswrapper[4831]: I0309 15:58:27.739077 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:28 crc kubenswrapper[4831]: I0309 15:58:28.534263 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:28Z is after 2026-02-23T05:33:13Z Mar 09 15:58:28 crc kubenswrapper[4831]: E0309 15:58:28.789050 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:28Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 15:58:28 crc kubenswrapper[4831]: I0309 15:58:28.798386 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 09 15:58:28 crc kubenswrapper[4831]: I0309 15:58:28.799819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:28 crc kubenswrapper[4831]: I0309 15:58:28.799857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:28 crc kubenswrapper[4831]: I0309 15:58:28.799871 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:28 crc kubenswrapper[4831]: I0309 15:58:28.799897 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:58:28 crc kubenswrapper[4831]: E0309 15:58:28.802734 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:28Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 15:58:29 crc kubenswrapper[4831]: I0309 15:58:29.532323 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:29Z is after 2026-02-23T05:33:13Z Mar 09 15:58:30 crc kubenswrapper[4831]: W0309 15:58:30.089337 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:30Z is after 2026-02-23T05:33:13Z Mar 09 15:58:30 crc kubenswrapper[4831]: E0309 15:58:30.089565 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed 
to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:30 crc kubenswrapper[4831]: I0309 15:58:30.532433 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:30Z is after 2026-02-23T05:33:13Z Mar 09 15:58:30 crc kubenswrapper[4831]: I0309 15:58:30.738283 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:58:30 crc kubenswrapper[4831]: I0309 15:58:30.738369 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:58:31 crc kubenswrapper[4831]: I0309 15:58:31.534694 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:31Z is after 
2026-02-23T05:33:13Z Mar 09 15:58:31 crc kubenswrapper[4831]: I0309 15:58:31.621172 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:31 crc kubenswrapper[4831]: I0309 15:58:31.623109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:31 crc kubenswrapper[4831]: I0309 15:58:31.623157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:31 crc kubenswrapper[4831]: I0309 15:58:31.623178 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:31 crc kubenswrapper[4831]: I0309 15:58:31.624007 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.533960 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:32Z is after 2026-02-23T05:33:13Z Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.799035 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.799772 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.802005 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09" exitCode=255 Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.802064 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09"} Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.802127 4831 scope.go:117] "RemoveContainer" containerID="6a90f2806b35318b08dd609ce3d6132274c6ab4ea7cbc0c575a19ec9f7540105" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.802288 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.803228 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.803261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.803274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:32 crc kubenswrapper[4831]: I0309 15:58:32.805188 4831 scope.go:117] "RemoveContainer" containerID="be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09" Mar 09 15:58:32 crc kubenswrapper[4831]: E0309 15:58:32.805546 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:33 crc kubenswrapper[4831]: I0309 15:58:33.532956 4831 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:33Z is after 2026-02-23T05:33:13Z Mar 09 15:58:33 crc kubenswrapper[4831]: E0309 15:58:33.698941 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:58:33 crc kubenswrapper[4831]: I0309 15:58:33.807256 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 15:58:34 crc kubenswrapper[4831]: I0309 15:58:34.210658 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 15:58:34 crc kubenswrapper[4831]: E0309 15:58:34.215988 4831 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:34 crc kubenswrapper[4831]: E0309 15:58:34.217246 4831 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 09 15:58:34 crc kubenswrapper[4831]: I0309 15:58:34.536997 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:34Z is after 2026-02-23T05:33:13Z Mar 09 15:58:35 crc kubenswrapper[4831]: I0309 15:58:35.533742 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:35Z is after 2026-02-23T05:33:13Z Mar 09 15:58:35 crc kubenswrapper[4831]: E0309 15:58:35.794681 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:35Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 15:58:35 crc kubenswrapper[4831]: I0309 15:58:35.802833 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:35 crc kubenswrapper[4831]: I0309 15:58:35.804266 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:35 crc kubenswrapper[4831]: I0309 15:58:35.804322 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:35 crc kubenswrapper[4831]: I0309 15:58:35.804341 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:35 crc kubenswrapper[4831]: I0309 15:58:35.804373 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:58:35 crc kubenswrapper[4831]: E0309 15:58:35.807933 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 15:58:36 crc kubenswrapper[4831]: I0309 15:58:36.534697 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:36Z is after 2026-02-23T05:33:13Z Mar 09 15:58:36 crc kubenswrapper[4831]: I0309 15:58:36.667112 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:58:36 crc kubenswrapper[4831]: I0309 15:58:36.667272 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:36 crc kubenswrapper[4831]: I0309 15:58:36.668568 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:36 crc kubenswrapper[4831]: I0309 15:58:36.668612 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:36 crc kubenswrapper[4831]: I0309 15:58:36.668628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:36 crc kubenswrapper[4831]: I0309 15:58:36.669253 4831 scope.go:117] "RemoveContainer" containerID="be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09" Mar 09 15:58:36 crc kubenswrapper[4831]: E0309 15:58:36.669519 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:37 crc kubenswrapper[4831]: E0309 15:58:37.480619 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:37Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b37798eb45b12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,LastTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:37 crc kubenswrapper[4831]: I0309 15:58:37.533650 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:37Z is after 2026-02-23T05:33:13Z Mar 09 15:58:38 crc kubenswrapper[4831]: W0309 15:58:38.356803 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:38Z is after 2026-02-23T05:33:13Z Mar 09 15:58:38 crc kubenswrapper[4831]: E0309 15:58:38.357644 4831 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:38 crc kubenswrapper[4831]: I0309 15:58:38.409591 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:58:38 crc kubenswrapper[4831]: I0309 15:58:38.409845 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:38 crc kubenswrapper[4831]: I0309 15:58:38.411473 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:38 crc kubenswrapper[4831]: I0309 15:58:38.411550 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:38 crc kubenswrapper[4831]: I0309 15:58:38.411574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:38 crc kubenswrapper[4831]: I0309 15:58:38.412532 4831 scope.go:117] "RemoveContainer" containerID="be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09" Mar 09 15:58:38 crc kubenswrapper[4831]: E0309 15:58:38.412822 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:38 crc kubenswrapper[4831]: I0309 15:58:38.534559 4831 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:38Z is after 2026-02-23T05:33:13Z Mar 09 15:58:39 crc kubenswrapper[4831]: W0309 15:58:39.354994 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:39Z is after 2026-02-23T05:33:13Z Mar 09 15:58:39 crc kubenswrapper[4831]: E0309 15:58:39.355128 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:39 crc kubenswrapper[4831]: I0309 15:58:39.535525 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:39Z is after 2026-02-23T05:33:13Z Mar 09 15:58:40 crc kubenswrapper[4831]: I0309 15:58:40.534030 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T15:58:40Z is after 2026-02-23T05:33:13Z Mar 09 15:58:40 crc kubenswrapper[4831]: I0309 15:58:40.737395 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:58:40 crc kubenswrapper[4831]: I0309 15:58:40.737558 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:58:41 crc kubenswrapper[4831]: W0309 15:58:41.216679 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:41Z is after 2026-02-23T05:33:13Z Mar 09 15:58:41 crc kubenswrapper[4831]: E0309 15:58:41.216757 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 15:58:41 crc kubenswrapper[4831]: I0309 15:58:41.533730 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:41Z is after 2026-02-23T05:33:13Z Mar 09 15:58:42 crc kubenswrapper[4831]: I0309 15:58:42.531830 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:42Z is after 2026-02-23T05:33:13Z Mar 09 15:58:42 crc kubenswrapper[4831]: E0309 15:58:42.798260 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:42Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 15:58:42 crc kubenswrapper[4831]: I0309 15:58:42.808707 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:42 crc kubenswrapper[4831]: I0309 15:58:42.810790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:42 crc kubenswrapper[4831]: I0309 15:58:42.810844 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:42 crc kubenswrapper[4831]: I0309 15:58:42.810857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:42 crc kubenswrapper[4831]: I0309 15:58:42.810890 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:58:42 crc kubenswrapper[4831]: E0309 15:58:42.814719 4831 kubelet_node_status.go:99] "Unable to register node with 
API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:42Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 15:58:43 crc kubenswrapper[4831]: I0309 15:58:43.095186 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:58:43 crc kubenswrapper[4831]: I0309 15:58:43.095388 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:43 crc kubenswrapper[4831]: I0309 15:58:43.096765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:43 crc kubenswrapper[4831]: I0309 15:58:43.096812 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:43 crc kubenswrapper[4831]: I0309 15:58:43.096831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:43 crc kubenswrapper[4831]: I0309 15:58:43.533586 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:43Z is after 2026-02-23T05:33:13Z Mar 09 15:58:43 crc kubenswrapper[4831]: E0309 15:58:43.699126 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:58:44 crc kubenswrapper[4831]: I0309 15:58:44.533942 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T15:58:44Z is after 2026-02-23T05:33:13Z Mar 09 15:58:45 crc kubenswrapper[4831]: I0309 15:58:45.532279 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:45Z is after 2026-02-23T05:33:13Z Mar 09 15:58:46 crc kubenswrapper[4831]: I0309 15:58:46.533327 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:46Z is after 2026-02-23T05:33:13Z Mar 09 15:58:47 crc kubenswrapper[4831]: E0309 15:58:47.486857 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:47Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b37798eb45b12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,LastTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:47 crc kubenswrapper[4831]: I0309 15:58:47.533739 4831 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:47Z is after 2026-02-23T05:33:13Z Mar 09 15:58:48 crc kubenswrapper[4831]: I0309 15:58:48.534516 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:48Z is after 2026-02-23T05:33:13Z Mar 09 15:58:49 crc kubenswrapper[4831]: I0309 15:58:49.530784 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T15:58:49Z is after 2026-02-23T05:33:13Z Mar 09 15:58:49 crc kubenswrapper[4831]: E0309 15:58:49.802919 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 15:58:49 crc kubenswrapper[4831]: I0309 15:58:49.815259 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:49 crc kubenswrapper[4831]: I0309 15:58:49.816692 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:49 crc kubenswrapper[4831]: I0309 15:58:49.816739 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:49 crc kubenswrapper[4831]: I0309 15:58:49.816755 4831 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:49 crc kubenswrapper[4831]: I0309 15:58:49.816788 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:58:49 crc kubenswrapper[4831]: E0309 15:58:49.820882 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.532840 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.737251 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" start-of-body= Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.737331 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.737386 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.737537 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.739191 4831 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.739232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.739246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.739756 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 15:58:50 crc kubenswrapper[4831]: I0309 15:58:50.739836 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f" gracePeriod=30 Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.533062 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.617062 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.619052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.619134 4831 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.619151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.620028 4831 scope.go:117] "RemoveContainer" containerID="be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09" Mar 09 15:58:51 crc kubenswrapper[4831]: E0309 15:58:51.620253 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.857543 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.858981 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.859345 4831 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f" exitCode=255 Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.859501 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f"} Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 
15:58:51.859627 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f"} Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.859774 4831 scope.go:117] "RemoveContainer" containerID="4a986f4258cd0664514db523d44a9efccb0f106432bf7c9d6fe54af4099576f9" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.860434 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.865256 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.865290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:51 crc kubenswrapper[4831]: I0309 15:58:51.865304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:52 crc kubenswrapper[4831]: I0309 15:58:52.177851 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:52 crc kubenswrapper[4831]: I0309 15:58:52.535964 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:52 crc kubenswrapper[4831]: I0309 15:58:52.866588 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 15:58:52 crc kubenswrapper[4831]: I0309 15:58:52.868440 4831 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:52 crc kubenswrapper[4831]: I0309 15:58:52.869808 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:52 crc kubenswrapper[4831]: I0309 15:58:52.869843 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:52 crc kubenswrapper[4831]: I0309 15:58:52.869855 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:53 crc kubenswrapper[4831]: I0309 15:58:53.532786 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:53 crc kubenswrapper[4831]: E0309 15:58:53.699957 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:58:53 crc kubenswrapper[4831]: I0309 15:58:53.870468 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:53 crc kubenswrapper[4831]: I0309 15:58:53.871613 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:53 crc kubenswrapper[4831]: I0309 15:58:53.871661 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:53 crc kubenswrapper[4831]: I0309 15:58:53.871675 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:54 crc kubenswrapper[4831]: I0309 15:58:54.533884 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:55 crc kubenswrapper[4831]: I0309 15:58:55.536019 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:56 crc kubenswrapper[4831]: I0309 15:58:56.537040 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:56 crc kubenswrapper[4831]: E0309 15:58:56.808800 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 15:58:56 crc kubenswrapper[4831]: I0309 15:58:56.821902 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:56 crc kubenswrapper[4831]: I0309 15:58:56.824364 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:56 crc kubenswrapper[4831]: I0309 15:58:56.824466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:56 crc kubenswrapper[4831]: I0309 15:58:56.824490 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:56 crc kubenswrapper[4831]: I0309 15:58:56.824536 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:58:56 crc kubenswrapper[4831]: E0309 15:58:56.831437 4831 kubelet_node_status.go:99] "Unable to register node with API server" 
err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.492441 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37798eb45b12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,LastTimestamp:2026-03-09 15:57:53.525177106 +0000 UTC m=+0.658859569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.497553 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.501385 4831 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.504889 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b377993691932 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,LastTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.513074 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779988f7736 default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.690531638 +0000 UTC m=+0.824214071,LastTimestamp:2026-03-09 15:57:53.690531638 +0000 UTC m=+0.824214071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.523656 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b3779936860e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.71838986 +0000 UTC m=+0.852072293,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: I0309 15:58:57.532374 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.537780 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b37799368d8f0\" is forbidden: User \"system:anonymous\" cannot 
patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.718426281 +0000 UTC m=+0.852108714,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.543125 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b377993691932\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b377993691932 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,LastTimestamp:2026-03-09 15:57:53.718436642 +0000 UTC m=+0.852119075,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.550892 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b3779936860e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.719532672 +0000 UTC m=+0.853215105,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.557147 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b37799368d8f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.719545243 +0000 UTC m=+0.853227676,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.561845 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b377993691932\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b377993691932 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,LastTimestamp:2026-03-09 15:57:53.719554773 +0000 UTC m=+0.853237206,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.568382 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b3779936860e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.720481209 +0000 UTC m=+0.854163642,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.574611 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b37799368d8f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.720493309 +0000 UTC m=+0.854175742,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.579457 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b377993691932\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b377993691932 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,LastTimestamp:2026-03-09 15:57:53.72050335 +0000 UTC m=+0.854185783,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.583870 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b3779936860e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC 
m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.720543791 +0000 UTC m=+0.854226224,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.587942 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b37799368d8f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.720557421 +0000 UTC m=+0.854239854,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.593845 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b377993691932\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b377993691932 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,LastTimestamp:2026-03-09 15:57:53.720588762 +0000 UTC m=+0.854271195,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.598893 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b3779936860e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.721462976 +0000 UTC m=+0.855145409,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.604198 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b37799368d8f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.721475977 +0000 UTC m=+0.855158410,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.610104 4831 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b377993691932\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b377993691932 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,LastTimestamp:2026-03-09 15:57:53.721486827 +0000 UTC m=+0.855169260,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.611883 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b3779936860e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.722441144 +0000 UTC m=+0.856123577,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.615964 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b37799368d8f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.722455384 +0000 UTC m=+0.856137817,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.621067 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b377993691932\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b377993691932 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604131122 +0000 UTC m=+0.737813575,LastTimestamp:2026-03-09 15:57:53.722465885 +0000 UTC m=+0.856148318,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.626705 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b3779936860e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b3779936860e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604083941 +0000 UTC m=+0.737766404,LastTimestamp:2026-03-09 15:57:53.72300217 +0000 UTC m=+0.856684603,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.632116 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b37799368d8f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b37799368d8f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:53.604114672 +0000 UTC m=+0.737797125,LastTimestamp:2026-03-09 15:57:53.72301948 +0000 UTC m=+0.856701913,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.638247 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b3779b2c31db2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.13012421 +0000 UTC m=+1.263806673,LastTimestamp:2026-03-09 15:57:54.13012421 +0000 UTC m=+1.263806673,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.645579 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3779b2c9e8a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.130569382 +0000 UTC m=+1.264251835,LastTimestamp:2026-03-09 15:57:54.130569382 +0000 UTC m=+1.264251835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.649966 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779b389982f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.143131695 +0000 UTC m=+1.276814158,LastTimestamp:2026-03-09 15:57:54.143131695 +0000 UTC m=+1.276814158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.655284 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3779b4726107 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.158387463 +0000 UTC m=+1.292069916,LastTimestamp:2026-03-09 
15:57:54.158387463 +0000 UTC m=+1.292069916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.660380 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b3779b4e48d81 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.165869953 +0000 UTC m=+1.299552416,LastTimestamp:2026-03-09 15:57:54.165869953 +0000 UTC m=+1.299552416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.666654 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3779d682ccbe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.729888958 +0000 UTC m=+1.863571391,LastTimestamp:2026-03-09 15:57:54.729888958 +0000 UTC m=+1.863571391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.672169 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b3779d6944073 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.731032691 +0000 UTC m=+1.864715124,LastTimestamp:2026-03-09 15:57:54.731032691 +0000 UTC m=+1.864715124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.677621 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779d6abdc0e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.732579854 +0000 UTC m=+1.866262297,LastTimestamp:2026-03-09 15:57:54.732579854 +0000 UTC m=+1.866262297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.682876 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3779d733fff8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.741501944 +0000 UTC m=+1.875184387,LastTimestamp:2026-03-09 15:57:54.741501944 +0000 UTC m=+1.875184387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.686820 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b3779d752487e 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.74348659 +0000 UTC m=+1.877169013,LastTimestamp:2026-03-09 15:57:54.74348659 +0000 UTC m=+1.877169013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.691838 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3779d761fc5f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.744515679 +0000 UTC m=+1.878198102,LastTimestamp:2026-03-09 15:57:54.744515679 +0000 UTC m=+1.878198102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.696425 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b3779d77101d5 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.745500117 +0000 UTC m=+1.879182540,LastTimestamp:2026-03-09 15:57:54.745500117 +0000 UTC m=+1.879182540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.701822 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779d78ef730 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.747463472 +0000 UTC m=+1.881145895,LastTimestamp:2026-03-09 15:57:54.747463472 +0000 UTC m=+1.881145895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.705441 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779d7a4e4dc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.748900572 +0000 UTC m=+1.882582995,LastTimestamp:2026-03-09 15:57:54.748900572 +0000 UTC m=+1.882582995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.709558 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3779d7df03a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.752709539 +0000 UTC m=+1.886391962,LastTimestamp:2026-03-09 15:57:54.752709539 +0000 UTC m=+1.886391962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc 
kubenswrapper[4831]: E0309 15:58:57.714549 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b3779d7fae34e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.75453627 +0000 UTC m=+1.888218703,LastTimestamp:2026-03-09 15:57:54.75453627 +0000 UTC m=+1.888218703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.718856 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779e7cc3f03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.019915011 +0000 UTC m=+2.153597464,LastTimestamp:2026-03-09 15:57:55.019915011 +0000 UTC m=+2.153597464,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.727154 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779e8a64834 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.034204212 +0000 UTC m=+2.167886635,LastTimestamp:2026-03-09 15:57:55.034204212 +0000 UTC m=+2.167886635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.730341 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779e8c53c2d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.036232749 +0000 UTC m=+2.169915202,LastTimestamp:2026-03-09 15:57:55.036232749 +0000 UTC m=+2.169915202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.734104 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779f749ce26 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.279801894 +0000 UTC m=+2.413484347,LastTimestamp:2026-03-09 15:57:55.279801894 +0000 UTC m=+2.413484347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: I0309 15:58:57.737936 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:58:57 crc kubenswrapper[4831]: I0309 15:58:57.738190 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:58:57 crc 
kubenswrapper[4831]: E0309 15:58:57.738577 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779f83b589f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.295631519 +0000 UTC m=+2.429313982,LastTimestamp:2026-03-09 15:57:55.295631519 +0000 UTC m=+2.429313982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: I0309 15:58:57.744387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:58:57 crc kubenswrapper[4831]: I0309 15:58:57.744500 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:58:57 crc kubenswrapper[4831]: I0309 15:58:57.744520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.749905 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779f8500847 openshift-kube-controller-manager 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.296987207 +0000 UTC m=+2.430669640,LastTimestamp:2026-03-09 15:57:55.296987207 +0000 UTC m=+2.430669640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.754677 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b377a0564ddd6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.516456406 +0000 UTC m=+2.650138839,LastTimestamp:2026-03-09 15:57:55.516456406 +0000 UTC m=+2.650138839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: 
E0309 15:58:57.758926 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b377a06413912 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.530897682 +0000 UTC m=+2.664580125,LastTimestamp:2026-03-09 15:57:55.530897682 +0000 UTC m=+2.664580125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.763332 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a0c97ec15 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 
15:57:55.637242901 +0000 UTC m=+2.770925364,LastTimestamp:2026-03-09 15:57:55.637242901 +0000 UTC m=+2.770925364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.767511 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a0cbb91d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.639579097 +0000 UTC m=+2.773261540,LastTimestamp:2026-03-09 15:57:55.639579097 +0000 UTC m=+2.773261540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.772176 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a0d34a504 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.64751386 +0000 UTC m=+2.781196323,LastTimestamp:2026-03-09 15:57:55.64751386 +0000 UTC m=+2.781196323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.776617 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b377a0d84bf5f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.652763487 +0000 UTC m=+2.786445950,LastTimestamp:2026-03-09 15:57:55.652763487 +0000 UTC m=+2.786445950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.780439 4831 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a1b1edfe1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.880968161 +0000 UTC m=+3.014650584,LastTimestamp:2026-03-09 15:57:55.880968161 +0000 UTC m=+3.014650584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.784226 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a1b21b3f6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.881153526 +0000 UTC m=+3.014835950,LastTimestamp:2026-03-09 15:57:55.881153526 +0000 UTC m=+3.014835950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc 
kubenswrapper[4831]: E0309 15:58:57.789124 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b377a1b2d09af openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.881896367 +0000 UTC m=+3.015578790,LastTimestamp:2026-03-09 15:57:55.881896367 +0000 UTC m=+3.015578790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.794555 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a1bdf5488 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.893580936 +0000 UTC m=+3.027263359,LastTimestamp:2026-03-09 15:57:55.893580936 +0000 UTC m=+3.027263359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.798979 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a1beecbc8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.894594504 +0000 UTC m=+3.028276927,LastTimestamp:2026-03-09 15:57:55.894594504 +0000 UTC m=+3.028276927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.803170 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b377a1bf6784a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.895097418 +0000 UTC m=+3.028779831,LastTimestamp:2026-03-09 15:57:55.895097418 +0000 UTC m=+3.028779831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.807623 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a1c103b98 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.896785816 +0000 UTC m=+3.030468249,LastTimestamp:2026-03-09 15:57:55.896785816 +0000 UTC m=+3.030468249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.811840 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a1c104ce6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.896790246 +0000 UTC m=+3.030472669,LastTimestamp:2026-03-09 15:57:55.896790246 +0000 UTC m=+3.030472669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.815725 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a1c19661b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.897386523 +0000 UTC m=+3.031068946,LastTimestamp:2026-03-09 15:57:55.897386523 +0000 UTC m=+3.031068946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.819158 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a1da7df4d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.923500877 +0000 UTC m=+3.057183310,LastTimestamp:2026-03-09 15:57:55.923500877 +0000 UTC m=+3.057183310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.822225 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a278077f8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.08869068 +0000 UTC m=+3.222373093,LastTimestamp:2026-03-09 15:57:56.08869068 +0000 UTC m=+3.222373093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.826580 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a27903025 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.089720869 +0000 UTC m=+3.223403292,LastTimestamp:2026-03-09 15:57:56.089720869 +0000 UTC m=+3.223403292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.830107 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a28445813 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.101527571 +0000 UTC m=+3.235209994,LastTimestamp:2026-03-09 15:57:56.101527571 +0000 UTC m=+3.235209994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.833426 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a28520ecc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.102426316 +0000 UTC m=+3.236108739,LastTimestamp:2026-03-09 15:57:56.102426316 +0000 UTC m=+3.236108739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.837180 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a2858e3f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.102874098 +0000 UTC m=+3.236556521,LastTimestamp:2026-03-09 15:57:56.102874098 +0000 UTC m=+3.236556521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 
15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.841814 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a2871d232 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.104507954 +0000 UTC m=+3.238190377,LastTimestamp:2026-03-09 15:57:56.104507954 +0000 UTC m=+3.238190377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.846131 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a321ce3ed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 
15:57:56.266714093 +0000 UTC m=+3.400396516,LastTimestamp:2026-03-09 15:57:56.266714093 +0000 UTC m=+3.400396516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.850912 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a324291dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.269183453 +0000 UTC m=+3.402865866,LastTimestamp:2026-03-09 15:57:56.269183453 +0000 UTC m=+3.402865866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.857086 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b377a32e0094f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.279503183 +0000 UTC m=+3.413185606,LastTimestamp:2026-03-09 15:57:56.279503183 +0000 UTC m=+3.413185606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.861090 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a3347fa42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.286315074 +0000 UTC m=+3.419997487,LastTimestamp:2026-03-09 15:57:56.286315074 +0000 UTC m=+3.419997487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.865900 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a3353888c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.287072396 +0000 UTC m=+3.420754819,LastTimestamp:2026-03-09 15:57:56.287072396 +0000 UTC m=+3.420754819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.870722 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a3eb9d77b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.478326651 +0000 UTC m=+3.612009114,LastTimestamp:2026-03-09 15:57:56.478326651 +0000 UTC m=+3.612009114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 
15:58:57.877158 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a3f87b474 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.4918181 +0000 UTC m=+3.625500523,LastTimestamp:2026-03-09 15:57:56.4918181 +0000 UTC m=+3.625500523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.883849 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a3f9513a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.492694435 +0000 UTC m=+3.626376858,LastTimestamp:2026-03-09 
15:57:56.492694435 +0000 UTC m=+3.626376858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.890696 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a499cd5f7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.660975095 +0000 UTC m=+3.794657538,LastTimestamp:2026-03-09 15:57:56.660975095 +0000 UTC m=+3.794657538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.895927 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a4b99df79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.694335353 +0000 UTC m=+3.828017776,LastTimestamp:2026-03-09 15:57:56.694335353 +0000 UTC m=+3.828017776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.900876 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a4c678e00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.707814912 +0000 UTC m=+3.841497335,LastTimestamp:2026-03-09 15:57:56.707814912 +0000 UTC m=+3.841497335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.906902 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a54819590 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.843738512 +0000 UTC m=+3.977420935,LastTimestamp:2026-03-09 15:57:56.843738512 +0000 UTC m=+3.977420935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.913510 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a556dd045 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.859220037 +0000 UTC m=+3.992902460,LastTimestamp:2026-03-09 15:57:56.859220037 +0000 UTC m=+3.992902460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.919364 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a86375b92 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:57.677734802 +0000 UTC m=+4.811417265,LastTimestamp:2026-03-09 15:57:57.677734802 +0000 UTC m=+4.811417265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.921034 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a9342332a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:57.896549162 +0000 UTC m=+5.030231585,LastTimestamp:2026-03-09 15:57:57.896549162 +0000 UTC m=+5.030231585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.924522 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a93b14da5 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:57.903830437 +0000 UTC m=+5.037512860,LastTimestamp:2026-03-09 15:57:57.903830437 +0000 UTC m=+5.037512860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.926896 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377a93bfd5e8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:57.904782824 +0000 UTC m=+5.038465247,LastTimestamp:2026-03-09 15:57:57.904782824 +0000 UTC m=+5.038465247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.930737 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b377a9f8884e9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.102484201 +0000 UTC m=+5.236166634,LastTimestamp:2026-03-09 15:57:58.102484201 +0000 UTC m=+5.236166634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.935608 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377aa099349d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.120354973 +0000 UTC m=+5.254037446,LastTimestamp:2026-03-09 15:57:58.120354973 +0000 UTC m=+5.254037446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.939083 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377aa0acd837 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.121642039 +0000 UTC m=+5.255324472,LastTimestamp:2026-03-09 15:57:58.121642039 +0000 UTC m=+5.255324472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.943924 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377aad9fb641 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.338885185 +0000 UTC m=+5.472567618,LastTimestamp:2026-03-09 15:57:58.338885185 +0000 UTC m=+5.472567618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.949540 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377aae9e6c12 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.355577874 +0000 UTC m=+5.489260337,LastTimestamp:2026-03-09 15:57:58.355577874 +0000 UTC m=+5.489260337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.953644 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377aaeb1e2ac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.35685342 +0000 UTC m=+5.490535853,LastTimestamp:2026-03-09 15:57:58.35685342 +0000 UTC m=+5.490535853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.958237 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b377abd16b50a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.598341898 +0000 UTC m=+5.732024321,LastTimestamp:2026-03-09 15:57:58.598341898 +0000 UTC m=+5.732024321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.963479 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377abdb440a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.608666788 +0000 UTC m=+5.742349241,LastTimestamp:2026-03-09 15:57:58.608666788 +0000 UTC m=+5.742349241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.967297 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377abdc42991 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.609709457 +0000 UTC m=+5.743391920,LastTimestamp:2026-03-09 15:57:58.609709457 +0000 UTC m=+5.743391920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.972602 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b377ac908addf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.798749151 +0000 UTC m=+5.932431584,LastTimestamp:2026-03-09 15:57:58.798749151 +0000 UTC m=+5.932431584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.977386 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b377ac98bbdfc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:58.807338492 +0000 UTC m=+5.941020945,LastTimestamp:2026-03-09 15:57:58.807338492 +0000 UTC m=+5.941020945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.984654 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 15:58:57 crc kubenswrapper[4831]: &Event{ObjectMeta:{kube-controller-manager-crc.189b377b3c90ee91 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 15:58:57 crc kubenswrapper[4831]: body: Mar 09 15:58:57 crc kubenswrapper[4831]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:00.737058449 +0000 UTC m=+7.870740912,LastTimestamp:2026-03-09 15:58:00.737058449 +0000 UTC m=+7.870740912,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 09 15:58:57 crc kubenswrapper[4831]: > Mar 09 15:58:57 crc kubenswrapper[4831]: E0309 15:58:57.990775 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b377b3c928178 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:00.737161592 +0000 UTC m=+7.870844045,LastTimestamp:2026-03-09 15:58:00.737161592 +0000 UTC m=+7.870844045,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.000226 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 15:58:58 crc kubenswrapper[4831]: &Event{ObjectMeta:{kube-apiserver-crc.189b377ccabb8e7a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get 
"https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:38874->192.168.126.11:17697: read: connection reset by peer Mar 09 15:58:58 crc kubenswrapper[4831]: body: Mar 09 15:58:58 crc kubenswrapper[4831]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:07.417183866 +0000 UTC m=+14.550866299,LastTimestamp:2026-03-09 15:58:07.417183866 +0000 UTC m=+14.550866299,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 15:58:58 crc kubenswrapper[4831]: > Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.005211 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377ccabc6194 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38874->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:07.417237908 +0000 UTC m=+14.550920341,LastTimestamp:2026-03-09 15:58:07.417237908 +0000 UTC m=+14.550920341,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.012303 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b377a3f9513a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a3f9513a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.492694435 +0000 UTC m=+3.626376858,LastTimestamp:2026-03-09 15:58:07.717796066 +0000 UTC m=+14.851478479,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.019174 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b377a4b99df79\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a4b99df79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.694335353 +0000 UTC m=+3.828017776,LastTimestamp:2026-03-09 15:58:07.915588384 +0000 UTC m=+15.049270807,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.023170 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b377a4c678e00\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377a4c678e00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:56.707814912 +0000 UTC m=+3.841497335,LastTimestamp:2026-03-09 15:58:07.924744801 +0000 UTC m=+15.058427234,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.024777 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 15:58:58 crc kubenswrapper[4831]: &Event{ObjectMeta:{kube-apiserver-crc.189b377d042de66b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 15:58:58 crc kubenswrapper[4831]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 15:58:58 crc kubenswrapper[4831]: Mar 09 15:58:58 crc kubenswrapper[4831]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:08.380978795 +0000 UTC m=+15.514661218,LastTimestamp:2026-03-09 15:58:08.380978795 +0000 UTC m=+15.514661218,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 15:58:58 crc kubenswrapper[4831]: > Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.026836 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b377d042e6e16 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:08.381013526 +0000 UTC m=+15.514695949,LastTimestamp:2026-03-09 15:58:08.381013526 +0000 UTC m=+15.514695949,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.031422 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 
15:58:58 crc kubenswrapper[4831]: &Event{ObjectMeta:{kube-controller-manager-crc.189b377d90a3f2ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 15:58:58 crc kubenswrapper[4831]: body: Mar 09 15:58:58 crc kubenswrapper[4831]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:10.73752545 +0000 UTC m=+17.871207873,LastTimestamp:2026-03-09 15:58:10.73752545 +0000 UTC m=+17.871207873,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 15:58:58 crc kubenswrapper[4831]: > Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.035128 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b377d90a48b5f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:10.737564511 +0000 UTC m=+17.871246934,LastTimestamp:2026-03-09 15:58:10.737564511 +0000 UTC m=+17.871246934,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.040557 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b377d90a3f2ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 15:58:58 crc kubenswrapper[4831]: &Event{ObjectMeta:{kube-controller-manager-crc.189b377d90a3f2ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 15:58:58 crc kubenswrapper[4831]: body: Mar 09 15:58:58 crc kubenswrapper[4831]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:10.73752545 +0000 UTC m=+17.871207873,LastTimestamp:2026-03-09 15:58:20.737871157 +0000 UTC m=+27.871553620,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 15:58:58 crc kubenswrapper[4831]: > Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.045165 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b377d90a48b5f\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b377d90a48b5f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:10.737564511 +0000 UTC m=+17.871246934,LastTimestamp:2026-03-09 15:58:20.737954479 +0000 UTC m=+27.871636942,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.049068 4831 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b377fe4e8845e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:20.741239902 +0000 UTC m=+27.874922355,LastTimestamp:2026-03-09 
15:58:20.741239902 +0000 UTC m=+27.874922355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.053127 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b3779d7a4e4dc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779d7a4e4dc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:54.748900572 +0000 UTC m=+1.882582995,LastTimestamp:2026-03-09 15:58:20.863245821 +0000 UTC m=+27.996928274,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.057818 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b3779e7cc3f03\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779e7cc3f03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.019915011 +0000 UTC m=+2.153597464,LastTimestamp:2026-03-09 15:58:21.099591694 +0000 UTC m=+28.233274157,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.062683 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b3779e8a64834\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3779e8a64834 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:57:55.034204212 +0000 UTC m=+2.167886635,LastTimestamp:2026-03-09 15:58:21.115839941 +0000 UTC m=+28.249522404,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.067948 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b377d90a3f2ca\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 15:58:58 crc kubenswrapper[4831]: &Event{ObjectMeta:{kube-controller-manager-crc.189b377d90a3f2ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 15:58:58 crc kubenswrapper[4831]: body: Mar 09 15:58:58 crc kubenswrapper[4831]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:10.73752545 +0000 UTC m=+17.871207873,LastTimestamp:2026-03-09 15:58:30.738346509 +0000 UTC m=+37.872028972,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 15:58:58 crc kubenswrapper[4831]: > Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.071660 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b377d90a48b5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b377d90a48b5f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:10.737564511 +0000 UTC m=+17.871246934,LastTimestamp:2026-03-09 15:58:30.738641617 +0000 UTC m=+37.872324120,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 15:58:58 crc kubenswrapper[4831]: E0309 15:58:58.077329 4831 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b377d90a3f2ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 15:58:58 crc kubenswrapper[4831]: &Event{ObjectMeta:{kube-controller-manager-crc.189b377d90a3f2ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 15:58:58 crc kubenswrapper[4831]: body: Mar 09 15:58:58 crc kubenswrapper[4831]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 15:58:10.73752545 +0000 UTC m=+17.871207873,LastTimestamp:2026-03-09 15:58:40.737514749 +0000 UTC m=+47.871197212,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 15:58:58 crc kubenswrapper[4831]: > Mar 09 15:58:58 crc kubenswrapper[4831]: I0309 15:58:58.533262 4831 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:58:59 crc kubenswrapper[4831]: I0309 15:58:59.536107 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:00 crc kubenswrapper[4831]: I0309 15:59:00.400258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:59:00 crc kubenswrapper[4831]: I0309 15:59:00.400450 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:00 crc kubenswrapper[4831]: I0309 15:59:00.401582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:00 crc kubenswrapper[4831]: I0309 15:59:00.401634 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:00 crc kubenswrapper[4831]: I0309 15:59:00.401645 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:00 crc kubenswrapper[4831]: I0309 15:59:00.534110 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:01 crc kubenswrapper[4831]: I0309 15:59:01.532671 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 09 15:59:02 crc kubenswrapper[4831]: I0309 15:59:02.189997 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:59:02 crc kubenswrapper[4831]: I0309 15:59:02.190357 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:02 crc kubenswrapper[4831]: I0309 15:59:02.191468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:02 crc kubenswrapper[4831]: I0309 15:59:02.191595 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:02 crc kubenswrapper[4831]: I0309 15:59:02.191666 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:02 crc kubenswrapper[4831]: I0309 15:59:02.532519 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:03 crc kubenswrapper[4831]: I0309 15:59:03.534138 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:03 crc kubenswrapper[4831]: E0309 15:59:03.700602 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:59:03 crc kubenswrapper[4831]: E0309 15:59:03.814282 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 15:59:03 crc kubenswrapper[4831]: I0309 15:59:03.831703 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:03 crc kubenswrapper[4831]: I0309 15:59:03.833078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:03 crc kubenswrapper[4831]: I0309 15:59:03.833121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:03 crc kubenswrapper[4831]: I0309 15:59:03.833136 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:03 crc kubenswrapper[4831]: I0309 15:59:03.833173 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:59:03 crc kubenswrapper[4831]: E0309 15:59:03.837594 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 15:59:04 crc kubenswrapper[4831]: I0309 15:59:04.532728 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:05 crc kubenswrapper[4831]: I0309 15:59:05.533178 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.218803 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.235970 
4831 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.533201 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.617263 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.618598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.618670 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.618684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.619545 4831 scope.go:117] "RemoveContainer" containerID="be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.901787 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.903183 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40"} Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.903418 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.904804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.904863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:06 crc kubenswrapper[4831]: I0309 15:59:06.904880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:07 crc kubenswrapper[4831]: W0309 15:59:07.490666 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 09 15:59:07 crc kubenswrapper[4831]: E0309 15:59:07.490724 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.529599 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.908670 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.910283 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 
15:59:07.912954 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" exitCode=255 Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.912986 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40"} Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.913063 4831 scope.go:117] "RemoveContainer" containerID="be2eeb69752eec49a7d6daf82903ef6561eaecd5032e523db704d3a4fc822a09" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.913257 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.914612 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.914666 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.914686 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:07 crc kubenswrapper[4831]: I0309 15:59:07.915609 4831 scope.go:117] "RemoveContainer" containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" Mar 09 15:59:07 crc kubenswrapper[4831]: E0309 15:59:07.916005 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.409933 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.536751 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.562165 4831 csr.go:261] certificate signing request csr-4frb9 is approved, waiting to be issued Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.569816 4831 csr.go:257] certificate signing request csr-4frb9 is issued Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.666366 4831 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.917117 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.918635 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.919677 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.919733 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.919758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:08 crc kubenswrapper[4831]: I0309 15:59:08.920789 4831 scope.go:117] "RemoveContainer" 
containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" Mar 09 15:59:08 crc kubenswrapper[4831]: E0309 15:59:08.921057 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:59:09 crc kubenswrapper[4831]: I0309 15:59:09.274047 4831 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 15:59:09 crc kubenswrapper[4831]: I0309 15:59:09.366681 4831 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 15:59:09 crc kubenswrapper[4831]: W0309 15:59:09.367016 4831 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 09 15:59:09 crc kubenswrapper[4831]: I0309 15:59:09.571297 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-19 13:34:25.320575155 +0000 UTC Mar 09 15:59:09 crc kubenswrapper[4831]: I0309 15:59:09.571372 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6117h35m15.749209631s for next certificate rotation Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.837670 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.839447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.839538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.839561 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.839704 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.850560 4831 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.850870 4831 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 15:59:10 crc kubenswrapper[4831]: E0309 15:59:10.850906 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.855452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.855523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.855545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.855575 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.855597 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:10Z","lastTransitionTime":"2026-03-09T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 15:59:10 crc kubenswrapper[4831]: E0309 15:59:10.875010 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.887166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.887479 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.887720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.887929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.888318 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:10Z","lastTransitionTime":"2026-03-09T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:10 crc kubenswrapper[4831]: E0309 15:59:10.906144 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.916972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.917031 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.917056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.917085 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.917107 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:10Z","lastTransitionTime":"2026-03-09T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:10 crc kubenswrapper[4831]: E0309 15:59:10.928562 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.939775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.939946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.940043 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.940136 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:10 crc kubenswrapper[4831]: I0309 15:59:10.940235 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:10Z","lastTransitionTime":"2026-03-09T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:10 crc kubenswrapper[4831]: E0309 15:59:10.953564 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:10 crc kubenswrapper[4831]: E0309 15:59:10.954020 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 15:59:10 crc kubenswrapper[4831]: E0309 15:59:10.954125 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.054323 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.155269 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.256207 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.356393 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.457376 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.557952 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 
15:59:11.658813 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.759385 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.860272 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:11 crc kubenswrapper[4831]: E0309 15:59:11.961476 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.061676 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.162698 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.263118 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.363738 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.464234 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.564639 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.665461 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.766521 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.867490 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:12 crc kubenswrapper[4831]: E0309 15:59:12.967711 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.068141 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.168774 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.269242 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.369749 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.470885 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.571132 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.672054 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.701683 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.773163 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.873974 4831 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 09 15:59:13 crc kubenswrapper[4831]: E0309 15:59:13.974815 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.074934 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.175464 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.275598 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.376368 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.476845 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.577483 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.678511 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.778872 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.879627 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:14 crc kubenswrapper[4831]: E0309 15:59:14.980764 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: I0309 15:59:15.032357 4831 
reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.081212 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.182441 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.283504 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.383799 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.485190 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.585699 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.687120 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.788550 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.889466 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:15 crc kubenswrapper[4831]: E0309 15:59:15.990131 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.090500 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc 
kubenswrapper[4831]: E0309 15:59:16.191493 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.292346 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.393327 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.494448 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.595080 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.616907 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.618117 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.618160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.618172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.667263 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.667866 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.669455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.669656 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.669795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:16 crc kubenswrapper[4831]: I0309 15:59:16.670797 4831 scope.go:117] "RemoveContainer" containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.671213 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.695605 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.796790 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.897968 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:16 crc kubenswrapper[4831]: E0309 15:59:16.998163 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: I0309 15:59:17.047714 4831 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.098372 4831 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.199175 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.300370 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.401356 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.502378 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.603596 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.704316 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.804918 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:17 crc kubenswrapper[4831]: E0309 15:59:17.905992 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.006141 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.106384 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.207356 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.308266 4831 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.409421 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.510189 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.610708 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.711567 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.812587 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:18 crc kubenswrapper[4831]: E0309 15:59:18.912725 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.013841 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.114889 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.216022 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.317137 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.418074 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc 
kubenswrapper[4831]: E0309 15:59:19.519016 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.619571 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.719794 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.820160 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:19 crc kubenswrapper[4831]: E0309 15:59:19.921096 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.022132 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.123050 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.223868 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.324760 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.425192 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.525335 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.625842 4831 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.726963 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.828138 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:20 crc kubenswrapper[4831]: E0309 15:59:20.928570 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.029117 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.102598 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.106651 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.106675 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.106687 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.106703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.106714 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:21Z","lastTransitionTime":"2026-03-09T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.115132 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.118805 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.118843 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.118852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.118861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.118869 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:21Z","lastTransitionTime":"2026-03-09T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.131799 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.135273 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.135308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.135315 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.135328 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.135337 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:21Z","lastTransitionTime":"2026-03-09T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.152125 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.156738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.156806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.156828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.156855 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.156874 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:21Z","lastTransitionTime":"2026-03-09T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.182808 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.182934 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.182961 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.283853 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.384846 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.485454 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.585803 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.617335 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.618947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.618981 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:21 crc kubenswrapper[4831]: I0309 15:59:21.618990 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.686074 
4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.787012 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.888161 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:21 crc kubenswrapper[4831]: E0309 15:59:21.989185 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.090057 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.190383 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.291215 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.392143 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.493356 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.594215 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.694463 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.795047 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc 
kubenswrapper[4831]: E0309 15:59:22.895976 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:22 crc kubenswrapper[4831]: E0309 15:59:22.997246 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.097490 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.198011 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.298630 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.399026 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.500078 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.601217 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.702325 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.702448 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.803320 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:23 crc kubenswrapper[4831]: E0309 15:59:23.903871 4831 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.004529 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.104637 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.205344 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.305865 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.406597 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.507721 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.608366 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.708450 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.808557 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:24 crc kubenswrapper[4831]: E0309 15:59:24.909034 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.009666 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.110123 4831 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.211231 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.312253 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.412592 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.512756 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.613258 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.713569 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.814040 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:25 crc kubenswrapper[4831]: E0309 15:59:25.915187 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.016258 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.117118 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.217450 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc 
kubenswrapper[4831]: E0309 15:59:26.318421 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.419558 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.519905 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.620456 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.721089 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.821767 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:26 crc kubenswrapper[4831]: E0309 15:59:26.922702 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.023459 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.124182 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.224346 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.324987 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.425309 4831 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.525818 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.625913 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.726617 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.827599 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:27 crc kubenswrapper[4831]: E0309 15:59:27.928484 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.029562 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.129831 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.230623 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.330741 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.431561 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.532088 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.633291 4831 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.734295 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.835478 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:28 crc kubenswrapper[4831]: E0309 15:59:28.936121 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.036559 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.136654 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.237612 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.337740 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.438766 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.539056 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.640076 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.740273 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 
15:59:29.841375 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:29 crc kubenswrapper[4831]: E0309 15:59:29.942330 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.043657 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.144012 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.244456 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.345662 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.445936 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.546789 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.647616 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.748479 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.848948 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:30 crc kubenswrapper[4831]: E0309 15:59:30.950334 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.051472 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.151597 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.252814 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.353493 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.408492 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.413878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.413941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.413963 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.413988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.414004 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:31Z","lastTransitionTime":"2026-03-09T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.430179 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.435166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.435227 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.435247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.435271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.435290 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:31Z","lastTransitionTime":"2026-03-09T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.451149 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.456135 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.456361 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.456793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.457275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.457554 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:31Z","lastTransitionTime":"2026-03-09T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.473485 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.479736 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.480082 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.480277 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.480533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.480750 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:31Z","lastTransitionTime":"2026-03-09T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.500074 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.500728 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.500883 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.601738 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.617341 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.618442 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.618473 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.618482 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:31 crc kubenswrapper[4831]: I0309 15:59:31.618935 4831 scope.go:117] "RemoveContainer" containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.619084 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.702105 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.803582 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:31 crc kubenswrapper[4831]: E0309 15:59:31.904119 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.005086 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.105287 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.205701 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.306586 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.407043 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.508034 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.609099 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.709965 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc 
kubenswrapper[4831]: E0309 15:59:32.811360 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:32 crc kubenswrapper[4831]: E0309 15:59:32.912530 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.013207 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.113332 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.213916 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.314640 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.415724 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.516086 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.617051 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.703304 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.717187 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.817794 4831 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 09 15:59:33 crc kubenswrapper[4831]: E0309 15:59:33.918273 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.018872 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.119593 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.220547 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.321444 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.421576 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.521782 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.622689 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.723583 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.823833 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:34 crc kubenswrapper[4831]: E0309 15:59:34.924470 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.025147 4831 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.126102 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.226486 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.326641 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.426903 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.528044 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.628619 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.728743 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.829294 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:35 crc kubenswrapper[4831]: E0309 15:59:35.929858 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.030797 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.131899 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc 
kubenswrapper[4831]: E0309 15:59:36.232257 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.332663 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.433271 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.533485 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.634289 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.735195 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.836350 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:36 crc kubenswrapper[4831]: E0309 15:59:36.936760 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.037621 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.137790 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.238692 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.339531 4831 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.440593 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.541223 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.642147 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.743091 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.843658 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:37 crc kubenswrapper[4831]: E0309 15:59:37.943867 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.044717 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.144845 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.245996 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.346437 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.446978 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.547210 4831 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.647855 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.748128 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.848819 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:38 crc kubenswrapper[4831]: E0309 15:59:38.949612 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.049809 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.150414 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.250564 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.351106 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.452127 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.553023 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.653446 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 
15:59:39.754657 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.855657 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:39 crc kubenswrapper[4831]: E0309 15:59:39.956426 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.056579 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.157332 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.259064 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.360117 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.460598 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.561465 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.662587 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.763285 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.863893 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
15:59:40 crc kubenswrapper[4831]: E0309 15:59:40.964940 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.065430 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.166193 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.267289 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.367808 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.468229 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.523595 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.528994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.529063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.529101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.529133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.529155 4831 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:41Z","lastTransitionTime":"2026-03-09T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.545358 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.551131 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.551194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.551223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.551256 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.551280 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:41Z","lastTransitionTime":"2026-03-09T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.568495 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.574391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.574510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.574536 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.574571 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.574594 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:41Z","lastTransitionTime":"2026-03-09T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.592670 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.598056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.598116 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.598128 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.598147 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:41 crc kubenswrapper[4831]: I0309 15:59:41.598160 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:41Z","lastTransitionTime":"2026-03-09T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.614905 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.615272 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.615328 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.716287 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.817128 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:41 crc kubenswrapper[4831]: E0309 15:59:41.917326 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.017459 4831 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.118104 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.218327 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.319511 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.420391 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.521493 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: I0309 15:59:42.616910 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:42 crc kubenswrapper[4831]: I0309 15:59:42.618171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:42 crc kubenswrapper[4831]: I0309 15:59:42.618305 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:42 crc kubenswrapper[4831]: I0309 15:59:42.618422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:42 crc kubenswrapper[4831]: I0309 15:59:42.619000 4831 scope.go:117] "RemoveContainer" containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.619236 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.622551 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.723298 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.823698 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:42 crc kubenswrapper[4831]: E0309 15:59:42.924425 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.025142 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.125960 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.226990 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.327971 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: I0309 15:59:43.427097 4831 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.428639 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.529375 4831 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.630610 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.703510 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.772790 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.873227 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:43 crc kubenswrapper[4831]: E0309 15:59:43.974201 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.075337 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.176507 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.277719 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.378432 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.479392 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.580630 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.681801 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.782854 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.883461 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:44 crc kubenswrapper[4831]: E0309 15:59:44.984376 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.085294 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.186244 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.287276 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.388947 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.489610 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.589802 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.691160 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.792520 4831 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.893287 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:45 crc kubenswrapper[4831]: E0309 15:59:45.994136 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.095174 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.195968 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.296549 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.397462 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.498193 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.599435 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.700046 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.801125 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:46 crc kubenswrapper[4831]: E0309 15:59:46.901914 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.002456 4831 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.102757 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.203270 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.304329 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.405227 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.505749 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.606848 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.707549 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.807819 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:47 crc kubenswrapper[4831]: E0309 15:59:47.908899 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.009358 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.109969 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc 
kubenswrapper[4831]: E0309 15:59:48.211138 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.311920 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.413166 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.514014 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.614186 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.714800 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.814917 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:48 crc kubenswrapper[4831]: E0309 15:59:48.915987 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.017111 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.117642 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.218115 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.319249 4831 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.420468 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.521498 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.622336 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.722794 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.823364 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:49 crc kubenswrapper[4831]: E0309 15:59:49.924103 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.025332 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.125698 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.226130 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.326818 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.427870 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.528235 4831 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.629019 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.729158 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.829802 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:50 crc kubenswrapper[4831]: E0309 15:59:50.930179 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.030567 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.131434 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.231757 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.332485 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.432796 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.533929 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.634879 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 
15:59:51.735248 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.835937 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.937036 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: E0309 15:59:51.987393 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 15:59:51 crc kubenswrapper[4831]: I0309 15:59:51.993483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:51 crc kubenswrapper[4831]: I0309 15:59:51.993546 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:51 crc kubenswrapper[4831]: I0309 15:59:51.993564 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:51 crc kubenswrapper[4831]: I0309 15:59:51.993590 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:51 crc kubenswrapper[4831]: I0309 15:59:51.993608 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:51Z","lastTransitionTime":"2026-03-09T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.007627 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.013247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.013302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.013321 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.013345 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.013359 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:52Z","lastTransitionTime":"2026-03-09T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.029189 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.034317 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.034365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.034381 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.034420 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.034432 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:52Z","lastTransitionTime":"2026-03-09T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.048725 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.052834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.052898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.052913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.053172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 15:59:52 crc kubenswrapper[4831]: I0309 15:59:52.053188 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T15:59:52Z","lastTransitionTime":"2026-03-09T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.066207 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.066362 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.066388 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.166558 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.267514 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.368351 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.468556 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.569504 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.669910 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.770269 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.871362 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:52 crc kubenswrapper[4831]: E0309 15:59:52.972497 4831 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.072954 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.174054 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.275168 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.375901 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.476904 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.577601 4831 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.704227 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 15:59:53 crc kubenswrapper[4831]: E0309 15:59:53.722817 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 15:59:57 crc kubenswrapper[4831]: I0309 15:59:57.616479 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:57 crc kubenswrapper[4831]: I0309 15:59:57.617822 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:57 crc kubenswrapper[4831]: I0309 15:59:57.617887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:57 crc kubenswrapper[4831]: I0309 15:59:57.617911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:57 crc kubenswrapper[4831]: I0309 15:59:57.619359 4831 scope.go:117] "RemoveContainer" containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" Mar 09 15:59:58 crc kubenswrapper[4831]: I0309 15:59:58.066993 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 15:59:58 crc kubenswrapper[4831]: I0309 15:59:58.069144 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d"} Mar 09 15:59:58 crc kubenswrapper[4831]: I0309 15:59:58.069333 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 15:59:58 crc kubenswrapper[4831]: I0309 15:59:58.070551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 15:59:58 crc kubenswrapper[4831]: I0309 15:59:58.070591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 15:59:58 crc kubenswrapper[4831]: I0309 15:59:58.070607 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 15:59:58 crc kubenswrapper[4831]: E0309 15:59:58.724565 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.076142 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.077694 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.080235 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" exitCode=255 Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.080331 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d"} Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.080447 4831 scope.go:117] "RemoveContainer" containerID="e422d48461e0989e5072c1c68ebf394276c3c0856f045f508a296f3c80b2ea40" Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.080626 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.081670 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 
16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.081728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.081750 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:00 crc kubenswrapper[4831]: I0309 16:00:00.082960 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:00:00 crc kubenswrapper[4831]: E0309 16:00:00.083315 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:00:01 crc kubenswrapper[4831]: I0309 16:00:01.086005 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 09 16:00:01 crc kubenswrapper[4831]: I0309 16:00:01.825776 4831 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.301833 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.301919 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.301941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.301967 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.301989 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:02Z","lastTransitionTime":"2026-03-09T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.318216 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.325343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.325436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.325446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.325469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.325484 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:02Z","lastTransitionTime":"2026-03-09T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.340844 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.346157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.346226 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.346240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.346263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.346285 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:02Z","lastTransitionTime":"2026-03-09T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.363109 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.369286 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.369469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.369581 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.369696 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.369818 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:02Z","lastTransitionTime":"2026-03-09T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.381730 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.386391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.386523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.386594 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.386662 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.386751 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:02Z","lastTransitionTime":"2026-03-09T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.402424 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.402993 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.589828 4831 apiserver.go:52] "Watching apiserver" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.598011 4831 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.598730 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2597x","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6","openshift-multus/multus-additional-cni-plugins-sdswt","openshift-machine-config-operator/machine-config-daemon-4mvxc","openshift-multus/multus-9c746","openshift-network-diagnostics/network-check-target-xd92c","openshift-dns/node-resolver-46nwt","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-node-7jxjf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-image-registry/node-ca-kkb76"] Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.599340 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.599375 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.599469 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.599645 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.599807 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.600562 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.601129 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.601675 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.601967 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.602018 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.602159 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.602527 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.602228 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.602714 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.602862 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.603172 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.603214 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.603512 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.603686 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.604049 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.604385 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.605195 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.605443 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.607707 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.607768 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.607707 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.608311 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.612366 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.618591 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.619202 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.619295 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.619356 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.619458 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.619491 4831 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.619965 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620057 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620095 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620143 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620204 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620313 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620458 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620709 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.620928 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.621479 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.621881 4831 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622014 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622111 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622196 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622293 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622328 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622383 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622433 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622443 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622510 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.622604 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.634535 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.638720 4831 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.651452 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.663914 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.678449 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.695519 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698218 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698322 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698388 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698483 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698535 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698586 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698645 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698699 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698794 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698847 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698902 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 
09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698955 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.698990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699023 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699054 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699120 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699157 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699171 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699192 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699279 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699493 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699539 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699565 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699590 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699616 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699637 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699663 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699687 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699711 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699737 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699729 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699758 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699766 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699869 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.699909 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 
16:00:02.700051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.700309 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.700436 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.700497 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.700550 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.700845 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.700945 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701030 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701077 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701123 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701160 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701193 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701226 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701261 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701297 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701335 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701372 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701382 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: 
"7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701433 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701470 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701503 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701535 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701565 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701597 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701629 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701660 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701691 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701698 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701727 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701761 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701767 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701830 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701860 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701870 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701877 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.701884 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702075 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702171 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702209 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 
16:00:02.702215 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702242 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702267 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702276 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702513 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702568 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702591 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702608 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702642 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702680 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702719 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702778 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702814 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702848 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702917 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702946 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702838 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.702988 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703150 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703186 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703652 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703741 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703784 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703824 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703863 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703899 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703973 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704014 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704051 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704087 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704137 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704197 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703650 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703718 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.703995 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704323 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706144 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706192 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706236 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706271 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706306 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706615 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706679 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706723 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706816 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706855 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706892 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706927 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706964 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707011 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.707050 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707087 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707128 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707174 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707217 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707299 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707336 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707374 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707443 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707494 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707532 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707573 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707668 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707709 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708729 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708772 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708811 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708850 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708958 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709012 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709049 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709089 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709127 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709165 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709203 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709242 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709282 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709319 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709359 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709430 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709467 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709506 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709543 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709595 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709640 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709674 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709709 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709745 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709829 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709885 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709944 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709982 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710017 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710053 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710096 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710135 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710172 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710209 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710260 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710303 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710385 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710475 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.710525 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710588 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710628 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710675 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710715 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710803 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710846 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710891 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710942 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710986 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711035 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711077 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711115 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711155 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711195 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711238 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 
16:00:02.711278 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711319 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711358 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711427 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711505 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") 
pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711544 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711649 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711857 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704662 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704665 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704818 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.704802 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705161 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705540 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705551 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705643 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705308 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705738 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.705294 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706227 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.706300 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707373 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707468 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707472 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707554 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707785 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.707810 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.708298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709511 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709842 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709850 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.709834 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710001 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710074 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710075 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710252 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710518 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710668 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710717 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710815 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710971 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710955 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711236 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711303 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711408 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711494 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711745 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711759 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.712733 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.711952 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.712889 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.712953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.712988 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.712993 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713334 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713359 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713547 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713571 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713596 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713914 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.713987 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.714123 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:00:03.214099844 +0000 UTC m=+130.347782367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715091 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715313 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715380 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715734 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715782 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715782 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715776 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715744 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.715810 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716038 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716065 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716198 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716280 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716297 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716380 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716417 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716688 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.716695 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.717010 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.717193 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.717449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.717558 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.717993 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.718217 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.718446 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.718487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.718743 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.718576 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.719383 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.719514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.719646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.719725 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.719743 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.719803 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.719897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.720057 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.720113 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.720272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.720559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.720638 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.720683 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.710688 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.721069 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.721182 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.721667 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.721756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.721847 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.721908 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.721973 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.721976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722036 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722060 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722081 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722206 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-os-release\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722237 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwpz\" (UniqueName: \"kubernetes.io/projected/bdf8f784-8094-4b1c-96bb-f7997430a0ea-kube-api-access-jqwpz\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722259 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ae817cb5-b8a8-47c9-aa49-53a975c9329a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722298 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-rootfs\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 
16:00:02.722316 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-system-cni-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722334 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-daemon-config\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722368 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-log-socket\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722432 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-config\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722511 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g99g6\" (UniqueName: \"kubernetes.io/projected/b6f55879-dc86-45fb-8f15-9294bea295d7-kube-api-access-g99g6\") pod \"node-resolver-46nwt\" (UID: \"b6f55879-dc86-45fb-8f15-9294bea295d7\") " pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722547 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod 
"5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722557 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df601c5a-632b-476d-aa81-12f31472e452-serviceca\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722582 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722769 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-k8s-cni-cncf-io\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722857 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.723124 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-system-cni-dir\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.723185 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.723223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722830 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.723489 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724169 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-systemd-units\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724217 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-ovn-kubernetes\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp564\" (UniqueName: 
\"kubernetes.io/projected/498bff7b-8be5-4e87-8717-0de7f7a8b877-kube-api-access-xp564\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724310 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.722389 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.723574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724462 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6hd\" (UniqueName: \"kubernetes.io/projected/df601c5a-632b-476d-aa81-12f31472e452-kube-api-access-zs6hd\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724507 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724534 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znpxs\" (UniqueName: \"kubernetes.io/projected/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-kube-api-access-znpxs\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724624 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-etc-kubernetes\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cnibin\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724962 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cni-binary-copy\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.724994 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-kubelet\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.725019 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-etc-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.725007 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.725042 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-netd\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.725110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-netns\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726609 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726640 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6f55879-dc86-45fb-8f15-9294bea295d7-hosts-file\") pod \"node-resolver-46nwt\" (UID: \"b6f55879-dc86-45fb-8f15-9294bea295d7\") " pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726658 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-cni-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.725298 4831 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726020 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726678 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-cni-bin\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726756 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptbm\" (UniqueName: \"kubernetes.io/projected/c53277d4-7695-47e5-bacc-e6ab6dca1501-kube-api-access-nptbm\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726775 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4msk\" (UniqueName: 
\"kubernetes.io/projected/b1de3c2a-8954-4286-aa94-b16d80cf28ad-kube-api-access-r4msk\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726793 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ae817cb5-b8a8-47c9-aa49-53a975c9329a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726812 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-socket-dir-parent\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726833 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726852 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ae817cb5-b8a8-47c9-aa49-53a975c9329a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.725128 
4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.725815 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726201 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726139 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ghhg\" (UniqueName: \"kubernetes.io/projected/ae817cb5-b8a8-47c9-aa49-53a975c9329a-kube-api-access-8ghhg\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-proxy-tls\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726948 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-conf-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-multus-certs\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.726989 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-netns\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-systemd\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727054 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-node-log\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727103 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727162 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-os-release\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727390 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df601c5a-632b-476d-aa81-12f31472e452-host\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-ovn\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727512 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-bin\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727543 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-cni-multus\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727456 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727592 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-slash\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727739 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727776 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-kubelet\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727858 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-hostroot\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727883 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-var-lib-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.727968 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-script-lib\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728050 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728084 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-env-overrides\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728116 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 
16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728084 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.728123 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.728200 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:03.228179223 +0000 UTC m=+130.361861666 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-mcd-auth-proxy-config\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728232 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-cnibin\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c53277d4-7695-47e5-bacc-e6ab6dca1501-cni-binary-copy\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728333 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.728644 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728722 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovn-node-metrics-cert\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728760 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728972 4831 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.728990 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.729005 4831 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.729047 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.729050 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.729066 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.729137 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.729173 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.729190 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.729724 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.730517 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.730674 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.730907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.732151 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.736876 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.736907 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.737091 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.739330 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.739370 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.739385 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.739460 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.747208 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.748779 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.748836 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.748874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.749058 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.749138 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.749134 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.749356 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.754577 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.754432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.754665 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.754747 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.750286 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.754225 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.750255 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.755329 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.755563 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.755686 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.756065 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.756206 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.756233 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.756274 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.756487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.756975 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757233 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757378 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757611 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.757682 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:03.229262583 +0000 UTC m=+130.362944996 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757719 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757738 4831 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757759 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757772 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757785 4831 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757801 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.757815 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.757936 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758012 4831 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.758051 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:03.258026708 +0000 UTC m=+130.391709141 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758074 4831 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758114 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758138 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758168 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758206 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758227 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" 
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758247 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758263 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758284 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758343 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.758416 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:03.258366218 +0000 UTC m=+130.392048641 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758455 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758485 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758507 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758697 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758713 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758725 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758737 4831 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758798 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758821 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758835 4831 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758855 4831 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758876 4831 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758891 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 
09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.758906 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759111 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759142 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759185 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759250 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759264 4831 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759275 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759283 
4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759297 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759415 4831 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759444 4831 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759459 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759477 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759491 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759517 4831 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759802 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759877 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.759931 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760081 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.760104 4831 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760114 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760131 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760145 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760159 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760251 4831 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760262 4831 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.760274 4831 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760282 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760291 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760302 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760311 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760322 4831 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760340 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760787 4831 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760815 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760829 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760840 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.760977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761335 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761719 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761841 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761859 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761915 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761942 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761964 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761979 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.761993 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762012 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.762030 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762044 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762060 4831 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762078 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762092 4831 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762104 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762119 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762136 4831 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762152 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762164 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762177 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762194 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762225 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762238 4831 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762255 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 
crc kubenswrapper[4831]: I0309 16:00:02.762268 4831 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762287 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762299 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762316 4831 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762329 4831 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762352 4831 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762364 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762392 4831 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762434 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762450 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762465 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762481 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762495 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762516 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762644 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762675 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762693 4831 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762709 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762732 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762750 4831 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762765 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762955 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 
16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762961 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.762991 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.763009 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.764670 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.765241 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.765842 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.763024 4831 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.765928 4831 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.765954 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.765970 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.765983 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766002 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766015 4831 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766028 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766081 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766099 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766114 4831 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766134 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766224 4831 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath 
\"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766241 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766267 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766284 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766296 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766308 4831 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766318 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766333 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.766349 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766360 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766382 4831 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766407 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766432 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766442 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766454 4831 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766465 4831 reconciler_common.go:293] "Volume detached for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766474 4831 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.766484 4831 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.767666 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.768652 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.772001 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.772462 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.779408 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.781254 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.787309 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.795490 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.798949 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.802866 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.816004 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.824197 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.832132 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.840652 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867033 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp564\" (UniqueName: \"kubernetes.io/projected/498bff7b-8be5-4e87-8717-0de7f7a8b877-kube-api-access-xp564\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867086 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867111 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-systemd-units\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-ovn-kubernetes\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867201 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-systemd-units\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867262 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867365 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-ovn-kubernetes\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867540 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znpxs\" (UniqueName: \"kubernetes.io/projected/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-kube-api-access-znpxs\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867632 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs6hd\" (UniqueName: \"kubernetes.io/projected/df601c5a-632b-476d-aa81-12f31472e452-kube-api-access-zs6hd\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-etc-kubernetes\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" 
Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867757 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cnibin\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867823 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cni-binary-copy\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867845 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-etc-kubernetes\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867867 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cnibin\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867889 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-kubelet\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867926 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-etc-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-kubelet\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.867985 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-netd\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868022 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-netns\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868042 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868058 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/b6f55879-dc86-45fb-8f15-9294bea295d7-hosts-file\") pod \"node-resolver-46nwt\" (UID: \"b6f55879-dc86-45fb-8f15-9294bea295d7\") " pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868064 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-netd\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868076 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ae817cb5-b8a8-47c9-aa49-53a975c9329a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868094 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-cni-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868111 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-cni-bin\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868121 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-openvswitch\") pod 
\"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868160 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nptbm\" (UniqueName: \"kubernetes.io/projected/c53277d4-7695-47e5-bacc-e6ab6dca1501-kube-api-access-nptbm\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-etc-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4msk\" (UniqueName: \"kubernetes.io/projected/b1de3c2a-8954-4286-aa94-b16d80cf28ad-kube-api-access-r4msk\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868284 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-socket-dir-parent\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868293 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-cni-bin\") pod \"multus-9c746\" (UID: 
\"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868322 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-netns\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868356 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ae817cb5-b8a8-47c9-aa49-53a975c9329a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ghhg\" (UniqueName: \"kubernetes.io/projected/ae817cb5-b8a8-47c9-aa49-53a975c9329a-kube-api-access-8ghhg\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868486 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-proxy-tls\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868550 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-conf-dir\") pod \"multus-9c746\" (UID: 
\"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868601 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-socket-dir-parent\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868612 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-multus-certs\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868633 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cni-binary-copy\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868653 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-netns\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-conf-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc 
kubenswrapper[4831]: I0309 16:00:02.868709 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-systemd\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868744 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-multus-certs\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868792 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-node-log\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868849 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-cni-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868881 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-os-release\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868897 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-systemd\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868906 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6f55879-dc86-45fb-8f15-9294bea295d7-hosts-file\") pod \"node-resolver-46nwt\" (UID: \"b6f55879-dc86-45fb-8f15-9294bea295d7\") " pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868919 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df601c5a-632b-476d-aa81-12f31472e452-host\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868931 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-netns\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868969 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df601c5a-632b-476d-aa81-12f31472e452-host\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868986 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-ovn\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.868995 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-os-release\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869002 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-node-log\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869019 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-bin\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-ovn\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-cni-multus\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869129 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-bin\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869122 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-slash\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-cni-multus\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-script-lib\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869228 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-slash\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-kubelet\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869359 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-hostroot\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869389 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-var-lib-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869464 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c53277d4-7695-47e5-bacc-e6ab6dca1501-cni-binary-copy\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869546 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-hostroot\") pod \"multus-9c746\" 
(UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869616 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-env-overrides\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869611 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-var-lib-openvswitch\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869687 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-var-lib-kubelet\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869713 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869821 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-mcd-auth-proxy-config\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.869960 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-cnibin\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.870144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ae817cb5-b8a8-47c9-aa49-53a975c9329a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.870200 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.870155 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovn-node-metrics-cert\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.870463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-os-release\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " 
pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.870701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwpz\" (UniqueName: \"kubernetes.io/projected/bdf8f784-8094-4b1c-96bb-f7997430a0ea-kube-api-access-jqwpz\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.870735 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871037 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-os-release\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871224 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-cnibin\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871295 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871470 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-mcd-auth-proxy-config\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.871474 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871621 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-env-overrides\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ae817cb5-b8a8-47c9-aa49-53a975c9329a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ae817cb5-b8a8-47c9-aa49-53a975c9329a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871753 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g99g6\" (UniqueName: \"kubernetes.io/projected/b6f55879-dc86-45fb-8f15-9294bea295d7-kube-api-access-g99g6\") pod \"node-resolver-46nwt\" (UID: \"b6f55879-dc86-45fb-8f15-9294bea295d7\") " pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-rootfs\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.871932 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-system-cni-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-daemon-config\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: E0309 16:00:02.872122 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs 
podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. No retries permitted until 2026-03-09 16:00:03.372006688 +0000 UTC m=+130.505689111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872205 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-log-socket\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-config\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872325 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872590 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " 
pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-rootfs\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872822 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-system-cni-dir\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.873269 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1de3c2a-8954-4286-aa94-b16d80cf28ad-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.873747 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovn-node-metrics-cert\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.874483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-script-lib\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 
16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.874828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-config\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.872883 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-log-socket\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.874939 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df601c5a-632b-476d-aa81-12f31472e452-serviceca\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.877684 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-proxy-tls\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.878083 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ae817cb5-b8a8-47c9-aa49-53a975c9329a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885159 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c53277d4-7695-47e5-bacc-e6ab6dca1501-cni-binary-copy\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885549 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df601c5a-632b-476d-aa81-12f31472e452-serviceca\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-k8s-cni-cncf-io\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885667 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-system-cni-dir\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885758 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885773 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: 
I0309 16:00:02.885788 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885804 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885819 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885832 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885846 4831 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885864 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885878 4831 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885889 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885902 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885915 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885930 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885942 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885956 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885968 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885979 4831 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.885994 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886006 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886018 4831 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886031 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886044 4831 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886059 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886072 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886085 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886098 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886113 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886126 4831 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886138 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886153 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886165 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886177 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886190 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886203 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886214 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886226 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886239 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886251 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 
16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886262 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886275 4831 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886287 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886301 4831 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886313 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886365 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1de3c2a-8954-4286-aa94-b16d80cf28ad-system-cni-dir\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886457 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/c53277d4-7695-47e5-bacc-e6ab6dca1501-host-run-k8s-cni-cncf-io\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.886956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c53277d4-7695-47e5-bacc-e6ab6dca1501-multus-daemon-config\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.890950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ghhg\" (UniqueName: \"kubernetes.io/projected/ae817cb5-b8a8-47c9-aa49-53a975c9329a-kube-api-access-8ghhg\") pod \"ovnkube-control-plane-749d76644c-zq8r6\" (UID: \"ae817cb5-b8a8-47c9-aa49-53a975c9329a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.895341 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptbm\" (UniqueName: \"kubernetes.io/projected/c53277d4-7695-47e5-bacc-e6ab6dca1501-kube-api-access-nptbm\") pod \"multus-9c746\" (UID: \"c53277d4-7695-47e5-bacc-e6ab6dca1501\") " pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.895913 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwpz\" (UniqueName: \"kubernetes.io/projected/bdf8f784-8094-4b1c-96bb-f7997430a0ea-kube-api-access-jqwpz\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.899673 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6hd\" (UniqueName: 
\"kubernetes.io/projected/df601c5a-632b-476d-aa81-12f31472e452-kube-api-access-zs6hd\") pod \"node-ca-kkb76\" (UID: \"df601c5a-632b-476d-aa81-12f31472e452\") " pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.902687 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znpxs\" (UniqueName: \"kubernetes.io/projected/a1a80160-b9c7-4ecc-99be-4438a7c6ad9c-kube-api-access-znpxs\") pod \"machine-config-daemon-4mvxc\" (UID: \"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\") " pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.902827 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp564\" (UniqueName: \"kubernetes.io/projected/498bff7b-8be5-4e87-8717-0de7f7a8b877-kube-api-access-xp564\") pod \"ovnkube-node-7jxjf\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") " pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.917428 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g99g6\" (UniqueName: \"kubernetes.io/projected/b6f55879-dc86-45fb-8f15-9294bea295d7-kube-api-access-g99g6\") pod \"node-resolver-46nwt\" (UID: \"b6f55879-dc86-45fb-8f15-9294bea295d7\") " pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.918133 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4msk\" (UniqueName: \"kubernetes.io/projected/b1de3c2a-8954-4286-aa94-b16d80cf28ad-kube-api-access-r4msk\") pod \"multus-additional-cni-plugins-sdswt\" (UID: \"b1de3c2a-8954-4286-aa94-b16d80cf28ad\") " pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.925330 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.942690 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.958198 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.970900 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9c746" Mar 09 16:00:02 crc kubenswrapper[4831]: W0309 16:00:02.979185 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-21305c48c687092779b6780ad00dfe6270cfd0a2c3f1576e3987f5722d418963 WatchSource:0}: Error finding container 21305c48c687092779b6780ad00dfe6270cfd0a2c3f1576e3987f5722d418963: Status 404 returned error can't find the container with id 21305c48c687092779b6780ad00dfe6270cfd0a2c3f1576e3987f5722d418963 Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.985321 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:02 crc kubenswrapper[4831]: W0309 16:00:02.986575 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc53277d4_7695_47e5_bacc_e6ab6dca1501.slice/crio-1cefce70b7f23d5fcaa28f764ee733fde9ae12c53fe6fdb7065fd7cc6327e72f WatchSource:0}: Error finding container 1cefce70b7f23d5fcaa28f764ee733fde9ae12c53fe6fdb7065fd7cc6327e72f: Status 404 returned error can't find the container with id 1cefce70b7f23d5fcaa28f764ee733fde9ae12c53fe6fdb7065fd7cc6327e72f Mar 09 16:00:02 crc kubenswrapper[4831]: I0309 16:00:02.994346 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-46nwt" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.005273 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.017766 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:00:03 crc kubenswrapper[4831]: W0309 16:00:03.028501 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod498bff7b_8be5_4e87_8717_0de7f7a8b877.slice/crio-b1659cb6d435b1f14250f65dff28760dfec76d127b9f93b040e2404abe8dcd21 WatchSource:0}: Error finding container b1659cb6d435b1f14250f65dff28760dfec76d127b9f93b040e2404abe8dcd21: Status 404 returned error can't find the container with id b1659cb6d435b1f14250f65dff28760dfec76d127b9f93b040e2404abe8dcd21 Mar 09 16:00:03 crc kubenswrapper[4831]: W0309 16:00:03.033289 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f55879_dc86_45fb_8f15_9294bea295d7.slice/crio-e7a23073c4421ebfd662faa9f1bcfbf1cebf97ddfaa3a5e6dc94324eade4f59b WatchSource:0}: Error finding container e7a23073c4421ebfd662faa9f1bcfbf1cebf97ddfaa3a5e6dc94324eade4f59b: Status 404 returned error can't find the container with id e7a23073c4421ebfd662faa9f1bcfbf1cebf97ddfaa3a5e6dc94324eade4f59b Mar 09 16:00:03 crc kubenswrapper[4831]: W0309 16:00:03.035038 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae817cb5_b8a8_47c9_aa49_53a975c9329a.slice/crio-2a7696e76e8d1ec53fcc1dbf23b5cedc17938e55a036fb8bac4704eb809b24d5 WatchSource:0}: Error finding container 2a7696e76e8d1ec53fcc1dbf23b5cedc17938e55a036fb8bac4704eb809b24d5: Status 404 returned error can't find the container with id 2a7696e76e8d1ec53fcc1dbf23b5cedc17938e55a036fb8bac4704eb809b24d5 Mar 09 16:00:03 crc kubenswrapper[4831]: W0309 16:00:03.037575 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a80160_b9c7_4ecc_99be_4438a7c6ad9c.slice/crio-8dccc43b52b0e40a2228ec99bf7d10e4c493c2e44a62732e69fa09af48ebca46 WatchSource:0}: Error finding container 8dccc43b52b0e40a2228ec99bf7d10e4c493c2e44a62732e69fa09af48ebca46: Status 404 returned error can't find the container with id 8dccc43b52b0e40a2228ec99bf7d10e4c493c2e44a62732e69fa09af48ebca46 Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.041531 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sdswt" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.057899 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kkb76" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.096082 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"21305c48c687092779b6780ad00dfe6270cfd0a2c3f1576e3987f5722d418963"} Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.098981 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eb771d1c76a3066210e01d9e1265db43599efe463d5c412406f265bd389eba67"} Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.102718 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9617a45c75b75aab882bf163439180ef02e15b362f83ae7d26171713589a8560"} Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.103720 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"8dccc43b52b0e40a2228ec99bf7d10e4c493c2e44a62732e69fa09af48ebca46"} Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.105352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" event={"ID":"ae817cb5-b8a8-47c9-aa49-53a975c9329a","Type":"ContainerStarted","Data":"2a7696e76e8d1ec53fcc1dbf23b5cedc17938e55a036fb8bac4704eb809b24d5"} Mar 09 16:00:03 crc kubenswrapper[4831]: W0309 16:00:03.105868 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1de3c2a_8954_4286_aa94_b16d80cf28ad.slice/crio-d74c2a07fd092fea7ed6524eb60fa02972cce9bdd2e04b69e4a9620557f4a79b WatchSource:0}: Error finding container d74c2a07fd092fea7ed6524eb60fa02972cce9bdd2e04b69e4a9620557f4a79b: Status 404 returned error can't find the container with id d74c2a07fd092fea7ed6524eb60fa02972cce9bdd2e04b69e4a9620557f4a79b Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.106914 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-46nwt" event={"ID":"b6f55879-dc86-45fb-8f15-9294bea295d7","Type":"ContainerStarted","Data":"e7a23073c4421ebfd662faa9f1bcfbf1cebf97ddfaa3a5e6dc94324eade4f59b"} Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.108157 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerStarted","Data":"1cefce70b7f23d5fcaa28f764ee733fde9ae12c53fe6fdb7065fd7cc6327e72f"} Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.109439 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"b1659cb6d435b1f14250f65dff28760dfec76d127b9f93b040e2404abe8dcd21"} Mar 09 16:00:03 crc 
kubenswrapper[4831]: I0309 16:00:03.290272 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.290766 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:00:04.290742615 +0000 UTC m=+131.424425038 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.291261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.291302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.291334 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.291371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291504 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291518 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291537 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291549 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291550 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:04.291541137 +0000 UTC m=+131.425223560 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291626 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291604 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:04.291588819 +0000 UTC m=+131.425271292 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291640 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291642 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291653 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291693 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:04.291682531 +0000 UTC m=+131.425365014 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.291714 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:04.291705152 +0000 UTC m=+131.425387675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.392469 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.392582 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.392645 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs 
podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. No retries permitted until 2026-03-09 16:00:04.392627712 +0000 UTC m=+131.526310145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.621146 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.621719 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.623052 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.624468 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.627192 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.628492 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 16:00:03 crc 
kubenswrapper[4831]: I0309 16:00:03.630153 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.632696 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.632832 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.634237 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.636652 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.638197 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.641049 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.642162 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.642757 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.643523 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.644199 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.644914 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.645699 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.646178 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.646813 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.647445 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.647941 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.650432 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.650912 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.651962 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.652421 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.653122 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.654433 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.654939 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.654921 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.656031 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.656686 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.657837 4831 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.657960 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.659764 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.660770 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.661223 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.663221 4831 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.663970 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.664853 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.664976 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.665699 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.667086 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.667658 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.668754 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.669350 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.670759 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.671258 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.672275 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.673247 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.674016 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.674570 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.675678 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.676132 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.676495 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.677054 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.677638 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.678086 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.689300 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.701167 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.712740 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: E0309 16:00:03.725189 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.728915 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.737997 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.745829 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.757188 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.765302 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:03 crc kubenswrapper[4831]: I0309 16:00:03.779704 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.114197 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1de3c2a-8954-4286-aa94-b16d80cf28ad" containerID="cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07" exitCode=0 Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.114283 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerDied","Data":"cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.114336 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerStarted","Data":"d74c2a07fd092fea7ed6524eb60fa02972cce9bdd2e04b69e4a9620557f4a79b"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.116276 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9" exitCode=0 Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.116341 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.118217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kkb76" event={"ID":"df601c5a-632b-476d-aa81-12f31472e452","Type":"ContainerStarted","Data":"6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a"} Mar 09 16:00:04 crc 
kubenswrapper[4831]: I0309 16:00:04.118257 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kkb76" event={"ID":"df601c5a-632b-476d-aa81-12f31472e452","Type":"ContainerStarted","Data":"d477dedd770343a6afde238be14b22c5795a606cf93f9a23d9e61dee29142ffc"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.119869 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-46nwt" event={"ID":"b6f55879-dc86-45fb-8f15-9294bea295d7","Type":"ContainerStarted","Data":"6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.122243 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerStarted","Data":"f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.123836 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.126565 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.126602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.129849 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.129879 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.133449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" event={"ID":"ae817cb5-b8a8-47c9-aa49-53a975c9329a","Type":"ContainerStarted","Data":"b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.133497 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" event={"ID":"ae817cb5-b8a8-47c9-aa49-53a975c9329a","Type":"ContainerStarted","Data":"f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b"} Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.139053 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.151949 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.169709 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.183760 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.207013 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.220236 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.232669 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc 
kubenswrapper[4831]: I0309 16:00:04.247506 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.263605 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.287646 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.302470 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.304877 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.304977 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.305003 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305064 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305093 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 16:00:06.305059671 +0000 UTC m=+133.438742124 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305132 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:06.305120663 +0000 UTC m=+133.438803296 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305143 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305167 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305180 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.305207 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305219 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:06.305209635 +0000 UTC m=+133.438892058 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.305273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305309 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305342 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:06.305330388 +0000 UTC m=+133.439012811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305361 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305380 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305395 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.305447 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-09 16:00:06.305436351 +0000 UTC m=+133.439118814 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.318222 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.331578 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.349144 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.361671 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc 
kubenswrapper[4831]: I0309 16:00:04.376499 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.392296 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.406311 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.406486 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.406548 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. 
No retries permitted until 2026-03-09 16:00:06.406531612 +0000 UTC m=+133.540214035 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.410758 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.427804 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.441295 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.459790 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.474988 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.490597 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.513297 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.524674 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.547728 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.560223 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.577793 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:04Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.616490 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.616510 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.616547 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:04 crc kubenswrapper[4831]: I0309 16:00:04.616515 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.616665 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.616712 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.616772 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:04 crc kubenswrapper[4831]: E0309 16:00:04.616814 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.141194 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c"} Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.141959 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae"} Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.141973 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b"} Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.141983 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf"} Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.141992 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c"} Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.143723 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" 
event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerStarted","Data":"8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80"} Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.162103 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.198449 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13
fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.12
6.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.240487 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.258453 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.282261 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.294588 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.311903 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.325357 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.336665 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.347595 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc 
kubenswrapper[4831]: I0309 16:00:05.360254 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.371371 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.383677 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:05 crc kubenswrapper[4831]: I0309 16:00:05.397599 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:05Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.153020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df"} Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.154689 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b"} Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.157148 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1de3c2a-8954-4286-aa94-b16d80cf28ad" containerID="8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80" exitCode=0 Mar 09 16:00:06 crc 
kubenswrapper[4831]: I0309 16:00:06.157171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerDied","Data":"8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80"} Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.174756 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.196309 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.209218 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.225630 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.237879 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.251257 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.262140 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.274041 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.283890 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.303999 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.316082 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.325012 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.325199 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325255 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:00:10.325210366 +0000 UTC m=+137.458892819 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.325312 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.325365 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325371 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.325460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 
16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325470 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325527 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325541 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325542 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.325566 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325550 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325494 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325484 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325597 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:10.325582967 +0000 UTC m=+137.459265410 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325813 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:10.325794063 +0000 UTC m=+137.459476486 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325828 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:10.325821443 +0000 UTC m=+137.459503866 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.325838 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:10.325833104 +0000 UTC m=+137.459515527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.334453 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc 
kubenswrapper[4831]: I0309 16:00:06.346731 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.357020 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.366633 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.384792 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.395999 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.406327 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.418477 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.426173 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.426345 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.426427 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. No retries permitted until 2026-03-09 16:00:10.42639241 +0000 UTC m=+137.560074833 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.430042 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc 
kubenswrapper[4831]: I0309 16:00:06.441346 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.452012 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.462629 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.473125 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.483056 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.494356 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.505095 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.616993 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.617207 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.617643 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.617727 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.617648 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.617795 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.617861 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.617958 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.668082 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.692049 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.692091 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.692038 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: E0309 16:00:06.692543 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.704799 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc 
kubenswrapper[4831]: I0309 16:00:06.720345 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.735807 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.750496 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.766458 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.785260 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.800313 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.819495 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.835254 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.849478 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.865322 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.877874 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:06 crc kubenswrapper[4831]: I0309 16:00:06.900041 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:06Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.163820 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1de3c2a-8954-4286-aa94-b16d80cf28ad" containerID="895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947" exitCode=0 Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.163911 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerDied","Data":"895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947"} Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.165084 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:00:07 crc kubenswrapper[4831]: E0309 16:00:07.165336 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.181061 4831 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc 
kubenswrapper[4831]: I0309 16:00:07.210139 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.232929 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.254999 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.271312 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.287297 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.299611 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.320635 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 
16:00:07.335856 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.349319 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.363366 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.374129 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.393947 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.405054 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:07 crc kubenswrapper[4831]: I0309 16:00:07.415011 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:07Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.171260 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1de3c2a-8954-4286-aa94-b16d80cf28ad" containerID="06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263" exitCode=0 Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.171331 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerDied","Data":"06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263"} Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.175889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" 
event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7"} Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.195559 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.210848 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.231670 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.247509 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.259552 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.275485 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f
64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.288276 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.299214 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc 
kubenswrapper[4831]: I0309 16:00:08.310052 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.322077 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.333220 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.348583 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.365311 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.377955 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.391349 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:08Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.409255 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.409956 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:00:08 crc kubenswrapper[4831]: E0309 16:00:08.410157 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.617321 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.617428 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.617360 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:08 crc kubenswrapper[4831]: I0309 16:00:08.617339 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:08 crc kubenswrapper[4831]: E0309 16:00:08.617625 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:08 crc kubenswrapper[4831]: E0309 16:00:08.617988 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:08 crc kubenswrapper[4831]: E0309 16:00:08.618068 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:08 crc kubenswrapper[4831]: E0309 16:00:08.618115 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:08 crc kubenswrapper[4831]: E0309 16:00:08.727017 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.183891 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1de3c2a-8954-4286-aa94-b16d80cf28ad" containerID="f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b" exitCode=0 Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.183955 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerDied","Data":"f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b"} Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.200675 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.218198 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.232390 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.255684 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.275068 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.291619 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.304922 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.315285 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.328371 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.339917 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.358475 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.373039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.385267 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.396593 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:09 crc kubenswrapper[4831]: I0309 16:00:09.408540 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:09Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 
16:00:10.189856 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1de3c2a-8954-4286-aa94-b16d80cf28ad" containerID="6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b" exitCode=0 Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.190257 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerDied","Data":"6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b"} Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.196315 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781"} Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.197134 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.197179 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.204759 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.212391 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.224015 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.230785 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.233589 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.247242 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.258254 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.267428 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.277760 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.290125 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.302264 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.315698 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.330828 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.364026 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.371315 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.371516 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371545 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:00:18.371514836 +0000 UTC m=+145.505197269 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.371606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.371670 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371678 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371703 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 
16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371721 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.371732 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371756 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371787 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:18.371766753 +0000 UTC m=+145.505449206 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371849 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:18.371827605 +0000 UTC m=+145.505510028 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371867 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371887 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371904 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371919 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371974 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:18.371918018 +0000 UTC m=+145.505600501 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.371998 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:18.37198924 +0000 UTC m=+145.505671773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.391684 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.418363 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.434120 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.448714 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.462608 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f
64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.473098 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.473255 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.473320 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. No retries permitted until 2026-03-09 16:00:18.473302306 +0000 UTC m=+145.606984729 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.475520 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.485628 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc 
kubenswrapper[4831]: I0309 16:00:10.496734 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.508186 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.521079 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.536353 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.550354 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.563223 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.578269 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.590126 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.601325 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.616898 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.616940 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.616960 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.617028 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.616897 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.617103 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.617224 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:10 crc kubenswrapper[4831]: E0309 16:00:10.617316 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.631320 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.645167 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:10 crc kubenswrapper[4831]: I0309 16:00:10.659239 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:10Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.203958 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" event={"ID":"b1de3c2a-8954-4286-aa94-b16d80cf28ad","Type":"ContainerStarted","Data":"88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2"} Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.230993 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.267258 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.294952 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.319118 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.332970 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.350057 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.378719 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.393842 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.408719 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.420805 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.437563 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.454387 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.470306 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.483850 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:11 crc kubenswrapper[4831]: I0309 16:00:11.495227 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:11Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 
16:00:12.209205 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/0.log" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.212620 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781" exitCode=1 Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.212672 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781"} Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.213735 4831 scope.go:117] "RemoveContainer" containerID="b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.226828 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.249953 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.262879 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.273322 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.299169 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 16:00:12.089553 6844 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 16:00:12.089632 6844 handler.go:190] Sending *v1.EgressIP 
event handler 8 for removal\\\\nI0309 16:00:12.089669 6844 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 16:00:12.089673 6844 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:12.089688 6844 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:12.089693 6844 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:12.089697 6844 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:12.089729 6844 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 16:00:12.089737 6844 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 16:00:12.089749 6844 factory.go:656] Stopping watch factory\\\\nI0309 16:00:12.089757 6844 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:12.089769 6844 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:12.089775 6844 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.313455 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.327492 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.345833 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.365293 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f
64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.382971 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.396934 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc 
kubenswrapper[4831]: I0309 16:00:12.412003 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.431896 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.461763 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.479331 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.505579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.505608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.505616 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.505628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.505637 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:12Z","lastTransitionTime":"2026-03-09T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.521867 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.526375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.526431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.526442 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.526457 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.526466 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:12Z","lastTransitionTime":"2026-03-09T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.542759 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.546841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.546895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.546910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.546931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.546945 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:12Z","lastTransitionTime":"2026-03-09T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.562496 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.566416 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.566453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.566466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.566488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.566506 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:12Z","lastTransitionTime":"2026-03-09T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.582984 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.586759 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.586813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.586831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.586856 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.586872 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:12Z","lastTransitionTime":"2026-03-09T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.601754 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:12Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.601910 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.617100 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.617162 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.617207 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:12 crc kubenswrapper[4831]: I0309 16:00:12.617230 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.617272 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.617434 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.617555 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:12 crc kubenswrapper[4831]: E0309 16:00:12.617621 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.218449 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/0.log" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.221003 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea"} Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.221986 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.235005 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.245362 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.256163 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.264831 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.289741 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 16:00:12.089553 6844 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 16:00:12.089632 6844 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:12.089669 6844 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI0309 16:00:12.089673 6844 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:12.089688 6844 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:12.089693 6844 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:12.089697 6844 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:12.089729 6844 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 16:00:12.089737 6844 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 16:00:12.089749 6844 factory.go:656] Stopping watch factory\\\\nI0309 16:00:12.089757 6844 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:12.089769 6844 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:12.089775 6844 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.301834 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f
64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.313235 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.321253 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc 
kubenswrapper[4831]: I0309 16:00:13.336873 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.350630 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.362298 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.374122 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.384564 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.401048 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.419607 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.634038 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.648102 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.666673 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.683077 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.701028 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f
0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.717245 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: E0309 16:00:13.727611 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.734044 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.759473 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.774532 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.798278 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 16:00:12.089553 6844 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 16:00:12.089632 6844 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:12.089669 6844 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI0309 16:00:12.089673 6844 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:12.089688 6844 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:12.089693 6844 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:12.089697 6844 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:12.089729 6844 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 16:00:12.089737 6844 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 16:00:12.089749 6844 factory.go:656] Stopping watch factory\\\\nI0309 16:00:12.089757 6844 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:12.089769 6844 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:12.089775 6844 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.815981 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.827798 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.838632 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc 
kubenswrapper[4831]: I0309 16:00:13.850772 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.865026 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:13 crc kubenswrapper[4831]: I0309 16:00:13.881650 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.229003 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/1.log" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.231341 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/0.log" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.237008 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea" exitCode=1 Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.237193 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea"} Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.237304 4831 scope.go:117] "RemoveContainer" containerID="b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.238603 4831 scope.go:117] "RemoveContainer" containerID="265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea" Mar 09 16:00:14 crc kubenswrapper[4831]: E0309 16:00:14.238887 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 
16:00:14.264166 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.284065 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.298738 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.313652 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.338757 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f
0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.362100 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.380064 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.397485 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.415939 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.447550 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7a26a311df8262f283f522f691ded4ecc90ba64c1372d73038d81939ee97781\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:12Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 16:00:12.089553 6844 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 16:00:12.089632 6844 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:12.089669 6844 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI0309 16:00:12.089673 6844 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:12.089688 6844 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:12.089693 6844 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:12.089697 6844 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:12.089729 6844 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 16:00:12.089737 6844 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 16:00:12.089749 6844 factory.go:656] Stopping watch factory\\\\nI0309 16:00:12.089757 6844 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:12.089769 6844 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:12.089763 6844 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:12.089775 6844 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:13Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 16:00:13.201507 6964 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:13.201544 6964 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:13.201581 6964 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 16:00:13.201586 6964 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:13.201624 6964 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 
16:00:13.201650 6964 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:13.201693 6964 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 16:00:13.201716 6964 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 16:00:13.201705 6964 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 16:00:13.201757 6964 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 16:00:13.201715 6964 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:13.201790 6964 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:13.201830 6964 factory.go:656] Stopping watch factory\\\\nI0309 16:00:13.201857 6964 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:13.201864 6964 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.468559 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.493173 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.506299 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc 
kubenswrapper[4831]: I0309 16:00:14.522108 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.540681 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.557054 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:14Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.617323 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.617384 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.617386 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:14 crc kubenswrapper[4831]: E0309 16:00:14.617493 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:14 crc kubenswrapper[4831]: I0309 16:00:14.617513 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:14 crc kubenswrapper[4831]: E0309 16:00:14.617666 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:14 crc kubenswrapper[4831]: E0309 16:00:14.617922 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:14 crc kubenswrapper[4831]: E0309 16:00:14.617995 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.242273 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/1.log" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.246277 4831 scope.go:117] "RemoveContainer" containerID="265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea" Mar 09 16:00:15 crc kubenswrapper[4831]: E0309 16:00:15.246486 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.261785 4831 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.279096 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.292779 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc 
kubenswrapper[4831]: I0309 16:00:15.307759 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.323153 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.340175 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.356574 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.371378 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.383621 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.401160 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.416692 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.432037 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.444062 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.460513 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.473706 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:15 crc kubenswrapper[4831]: I0309 16:00:15.504385 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:13Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 16:00:13.201507 6964 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:13.201544 6964 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:13.201581 6964 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0309 16:00:13.201586 6964 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:13.201624 6964 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:13.201650 6964 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:13.201693 6964 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 16:00:13.201716 6964 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 16:00:13.201705 6964 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 16:00:13.201757 6964 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 16:00:13.201715 6964 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:13.201790 6964 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:13.201830 6964 factory.go:656] Stopping watch factory\\\\nI0309 16:00:13.201857 6964 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:13.201864 6964 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:15Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:16 crc kubenswrapper[4831]: I0309 16:00:16.616555 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:16 crc kubenswrapper[4831]: I0309 16:00:16.616563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:16 crc kubenswrapper[4831]: I0309 16:00:16.616592 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:16 crc kubenswrapper[4831]: I0309 16:00:16.616669 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:16 crc kubenswrapper[4831]: E0309 16:00:16.617815 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:16 crc kubenswrapper[4831]: E0309 16:00:16.617908 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:16 crc kubenswrapper[4831]: E0309 16:00:16.618085 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:16 crc kubenswrapper[4831]: E0309 16:00:16.618164 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.459469 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.459660 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:00:34.459634272 +0000 UTC m=+161.593316695 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.459978 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.460026 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.460054 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.460097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460189 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460234 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460249 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460260 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460256 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460268 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460319 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460287 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:34.46025688 +0000 UTC m=+161.593939353 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460350 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460375 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:34.460348732 +0000 UTC m=+161.594031195 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460454 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:34.460387033 +0000 UTC m=+161.594069496 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.460504 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:00:34.460481596 +0000 UTC m=+161.594164059 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.561661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.561907 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.561989 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs 
podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. No retries permitted until 2026-03-09 16:00:34.561965917 +0000 UTC m=+161.695648360 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.616700 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.616733 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.616738 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.616837 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:18 crc kubenswrapper[4831]: I0309 16:00:18.616866 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.616964 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.617065 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.617201 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:18 crc kubenswrapper[4831]: E0309 16:00:18.729452 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:20 crc kubenswrapper[4831]: I0309 16:00:20.616696 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:20 crc kubenswrapper[4831]: I0309 16:00:20.616855 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:20 crc kubenswrapper[4831]: I0309 16:00:20.616925 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:20 crc kubenswrapper[4831]: E0309 16:00:20.616930 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:20 crc kubenswrapper[4831]: E0309 16:00:20.617089 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:20 crc kubenswrapper[4831]: I0309 16:00:20.616739 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:20 crc kubenswrapper[4831]: E0309 16:00:20.617224 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:20 crc kubenswrapper[4831]: E0309 16:00:20.617457 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.617344 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.617376 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.617491 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.617537 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.617494 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.617727 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.617905 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.618459 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.618828 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.619176 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.631990 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.825820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.825852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.825860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.825873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.825882 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:22Z","lastTransitionTime":"2026-03-09T16:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.841811 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:22Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.846520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.846555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.846565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.846577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.846586 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:22Z","lastTransitionTime":"2026-03-09T16:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.865889 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:22Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.870810 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.870851 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.870865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.870881 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.870891 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:22Z","lastTransitionTime":"2026-03-09T16:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.883175 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:22Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.886293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.886331 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.886345 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.886363 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.886378 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:22Z","lastTransitionTime":"2026-03-09T16:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.901922 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:22Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.905432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.905468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.905482 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.905506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:22 crc kubenswrapper[4831]: I0309 16:00:22.905522 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:22Z","lastTransitionTime":"2026-03-09T16:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.921575 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:22Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:22 crc kubenswrapper[4831]: E0309 16:00:22.921740 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.635581 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.649577 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.663826 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5e
daf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:
00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.683137 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.694800 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.718219 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:13Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 16:00:13.201507 6964 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:13.201544 6964 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:13.201581 6964 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0309 16:00:13.201586 6964 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:13.201624 6964 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:13.201650 6964 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:13.201693 6964 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 16:00:13.201716 6964 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 16:00:13.201705 6964 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 16:00:13.201757 6964 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 16:00:13.201715 6964 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:13.201790 6964 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:13.201830 6964 factory.go:656] Stopping watch factory\\\\nI0309 16:00:13.201857 6964 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:13.201864 6964 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: E0309 16:00:23.730059 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.736803 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.748810 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.761950 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.772589 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad
46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.787804 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.801961 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.816895 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc 
kubenswrapper[4831]: I0309 16:00:23.832021 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.843415 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.854470 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:23 crc kubenswrapper[4831]: I0309 16:00:23.866123 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:23Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:24 crc kubenswrapper[4831]: I0309 16:00:24.617366 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:24 crc kubenswrapper[4831]: I0309 16:00:24.617366 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:24 crc kubenswrapper[4831]: I0309 16:00:24.617505 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:24 crc kubenswrapper[4831]: I0309 16:00:24.617598 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:24 crc kubenswrapper[4831]: E0309 16:00:24.617750 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:24 crc kubenswrapper[4831]: E0309 16:00:24.617930 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:24 crc kubenswrapper[4831]: E0309 16:00:24.618081 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:24 crc kubenswrapper[4831]: E0309 16:00:24.618167 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:26 crc kubenswrapper[4831]: I0309 16:00:26.617187 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:26 crc kubenswrapper[4831]: I0309 16:00:26.617187 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:26 crc kubenswrapper[4831]: E0309 16:00:26.617327 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:26 crc kubenswrapper[4831]: I0309 16:00:26.617461 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:26 crc kubenswrapper[4831]: E0309 16:00:26.617491 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:26 crc kubenswrapper[4831]: E0309 16:00:26.617534 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:26 crc kubenswrapper[4831]: I0309 16:00:26.617870 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:26 crc kubenswrapper[4831]: E0309 16:00:26.617964 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:28 crc kubenswrapper[4831]: I0309 16:00:28.617200 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:28 crc kubenswrapper[4831]: I0309 16:00:28.617200 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:28 crc kubenswrapper[4831]: I0309 16:00:28.617304 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:28 crc kubenswrapper[4831]: I0309 16:00:28.617647 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:28 crc kubenswrapper[4831]: E0309 16:00:28.617814 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:28 crc kubenswrapper[4831]: E0309 16:00:28.617976 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:28 crc kubenswrapper[4831]: E0309 16:00:28.618104 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:28 crc kubenswrapper[4831]: E0309 16:00:28.618274 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:28 crc kubenswrapper[4831]: I0309 16:00:28.619023 4831 scope.go:117] "RemoveContainer" containerID="265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea" Mar 09 16:00:28 crc kubenswrapper[4831]: I0309 16:00:28.641337 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 16:00:28 crc kubenswrapper[4831]: E0309 16:00:28.731151 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.294674 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/1.log" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.297803 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c"} Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.298421 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.311021 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.325578 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.339006 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.359649 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.373500 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.388634 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.402888 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.414194 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.425881 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.448312 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:13Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 16:00:13.201507 6964 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:13.201544 6964 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:13.201581 6964 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0309 16:00:13.201586 6964 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:13.201624 6964 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:13.201650 6964 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:13.201693 6964 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 16:00:13.201716 6964 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 16:00:13.201705 6964 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 16:00:13.201757 6964 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 16:00:13.201715 6964 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:13.201790 6964 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:13.201830 6964 factory.go:656] Stopping watch factory\\\\nI0309 16:00:13.201857 6964 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:13.201864 6964 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\"
,\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.460652 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.474511 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.492610 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15
:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.503333 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.514015 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.526972 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.544120 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:29 crc kubenswrapper[4831]: I0309 16:00:29.554083 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.305316 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/2.log" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.306514 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/1.log" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.310347 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c" exitCode=1 Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.310448 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c"} Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.310506 4831 scope.go:117] "RemoveContainer" containerID="265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.311532 4831 scope.go:117] "RemoveContainer" containerID="274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c" Mar 09 16:00:30 crc kubenswrapper[4831]: E0309 16:00:30.311831 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.331796 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.346658 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running
\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.363439 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92
b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fe
e42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"
reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.380652 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.390521 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.409296 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://265bb7f7ed95e45cb9c392beec6230b96a883ad68f755faa4ac8bbe2c77064ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:13Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 16:00:13.201507 6964 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 16:00:13.201544 6964 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 16:00:13.201581 6964 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0309 16:00:13.201586 6964 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 16:00:13.201624 6964 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 16:00:13.201650 6964 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 16:00:13.201693 6964 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 16:00:13.201716 6964 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 16:00:13.201705 6964 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 16:00:13.201757 6964 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 16:00:13.201715 6964 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 16:00:13.201790 6964 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 16:00:13.201830 6964 factory.go:656] Stopping watch factory\\\\nI0309 16:00:13.201857 6964 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 16:00:13.201864 6964 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] 
failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.422113 4831 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.433697 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.459753 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15
:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.473338 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.485545 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.500712 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.513138 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.524069 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.539080 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.551650 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.564578 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.575311 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:30Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.616752 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:30 crc kubenswrapper[4831]: E0309 16:00:30.616877 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.616767 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.616771 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:30 crc kubenswrapper[4831]: I0309 16:00:30.616762 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:30 crc kubenswrapper[4831]: E0309 16:00:30.617042 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:30 crc kubenswrapper[4831]: E0309 16:00:30.617122 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:30 crc kubenswrapper[4831]: E0309 16:00:30.616954 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.317850 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/2.log" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.322907 4831 scope.go:117] "RemoveContainer" containerID="274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c" Mar 09 16:00:31 crc kubenswrapper[4831]: E0309 16:00:31.323275 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.342278 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.359336 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.376666 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.401144 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.418358 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.438737 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.460199 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.488993 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.504864 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.517763 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.538881 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15
:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.554979 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.571198 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.586540 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.604347 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.618306 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.637447 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:31 crc kubenswrapper[4831]: I0309 16:00:31.655734 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:31Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 
16:00:32.617257 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.617320 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.617356 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.617488 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:32 crc kubenswrapper[4831]: E0309 16:00:32.617527 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:32 crc kubenswrapper[4831]: E0309 16:00:32.617727 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:32 crc kubenswrapper[4831]: E0309 16:00:32.617948 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:32 crc kubenswrapper[4831]: E0309 16:00:32.618125 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.941732 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.941785 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.941797 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.941817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.941831 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:32Z","lastTransitionTime":"2026-03-09T16:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 16:00:32 crc kubenswrapper[4831]: E0309 16:00:32.958319 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:32Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.962019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.962060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.962069 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.962083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.962093 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:32Z","lastTransitionTime":"2026-03-09T16:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:32 crc kubenswrapper[4831]: E0309 16:00:32.975643 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:32Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.979689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.979727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.979747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.979768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.979786 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:32Z","lastTransitionTime":"2026-03-09T16:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:32 crc kubenswrapper[4831]: E0309 16:00:32.995285 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:32Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.999801 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.999847 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.999862 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.999884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:32 crc kubenswrapper[4831]: I0309 16:00:32.999900 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:32Z","lastTransitionTime":"2026-03-09T16:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:33 crc kubenswrapper[4831]: E0309 16:00:33.018760 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.022054 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.022096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.022104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.022118 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.022129 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:33Z","lastTransitionTime":"2026-03-09T16:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:33 crc kubenswrapper[4831]: E0309 16:00:33.042619 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: E0309 16:00:33.042740 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.618391 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:00:33 crc kubenswrapper[4831]: E0309 16:00:33.618573 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.633707 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.648542 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.664270 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.675135 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.685729 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.708304 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.724208 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: E0309 16:00:33.732119 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.741137 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.751873 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.781754 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.800684 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f
64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.812030 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.821467 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc 
kubenswrapper[4831]: I0309 16:00:33.833341 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.844708 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.857957 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.874767 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:33 crc kubenswrapper[4831]: I0309 16:00:33.889041 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.521221 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.521471 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:06.521443933 +0000 UTC m=+193.655126356 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.521796 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.521843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.521883 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.521937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522029 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522078 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522106 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522124 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522121 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522037 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522139 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:01:06.522110941 +0000 UTC m=+193.655793374 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522192 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522205 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522219 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 16:01:06.522197044 +0000 UTC m=+193.655879507 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522246 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:01:06.522229345 +0000 UTC m=+193.655911808 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.522268 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:01:06.522257765 +0000 UTC m=+193.655940218 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.617521 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.617578 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.617592 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.617527 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.617761 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.617834 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.618003 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.618166 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:34 crc kubenswrapper[4831]: I0309 16:00:34.623309 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.623505 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:34 crc kubenswrapper[4831]: E0309 16:00:34.623614 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. 
No retries permitted until 2026-03-09 16:01:06.623583832 +0000 UTC m=+193.757266295 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:00:36 crc kubenswrapper[4831]: I0309 16:00:36.616965 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:36 crc kubenswrapper[4831]: I0309 16:00:36.617045 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:36 crc kubenswrapper[4831]: I0309 16:00:36.617097 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:36 crc kubenswrapper[4831]: I0309 16:00:36.616965 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:36 crc kubenswrapper[4831]: E0309 16:00:36.617181 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:36 crc kubenswrapper[4831]: E0309 16:00:36.617439 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:36 crc kubenswrapper[4831]: E0309 16:00:36.617506 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:36 crc kubenswrapper[4831]: E0309 16:00:36.617642 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:38 crc kubenswrapper[4831]: I0309 16:00:38.616487 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:38 crc kubenswrapper[4831]: E0309 16:00:38.616725 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:38 crc kubenswrapper[4831]: I0309 16:00:38.616779 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:38 crc kubenswrapper[4831]: I0309 16:00:38.616831 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:38 crc kubenswrapper[4831]: E0309 16:00:38.616931 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:38 crc kubenswrapper[4831]: I0309 16:00:38.616781 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:38 crc kubenswrapper[4831]: E0309 16:00:38.617150 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:38 crc kubenswrapper[4831]: E0309 16:00:38.617476 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:38 crc kubenswrapper[4831]: E0309 16:00:38.733811 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:40 crc kubenswrapper[4831]: I0309 16:00:40.616354 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:40 crc kubenswrapper[4831]: I0309 16:00:40.616426 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:40 crc kubenswrapper[4831]: E0309 16:00:40.616519 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:40 crc kubenswrapper[4831]: I0309 16:00:40.616576 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:40 crc kubenswrapper[4831]: I0309 16:00:40.616574 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:40 crc kubenswrapper[4831]: E0309 16:00:40.616670 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:40 crc kubenswrapper[4831]: E0309 16:00:40.616758 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:40 crc kubenswrapper[4831]: E0309 16:00:40.616801 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:42 crc kubenswrapper[4831]: I0309 16:00:42.617167 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:42 crc kubenswrapper[4831]: I0309 16:00:42.617271 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:42 crc kubenswrapper[4831]: I0309 16:00:42.617274 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:42 crc kubenswrapper[4831]: I0309 16:00:42.617200 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:42 crc kubenswrapper[4831]: E0309 16:00:42.617475 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:42 crc kubenswrapper[4831]: E0309 16:00:42.617584 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:42 crc kubenswrapper[4831]: E0309 16:00:42.617722 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:42 crc kubenswrapper[4831]: E0309 16:00:42.617885 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.291549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.291647 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.291675 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.291708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.291813 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:43Z","lastTransitionTime":"2026-03-09T16:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:43 crc kubenswrapper[4831]: E0309 16:00:43.312194 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.317158 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.317203 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.317219 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.317239 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.317255 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:43Z","lastTransitionTime":"2026-03-09T16:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:43 crc kubenswrapper[4831]: E0309 16:00:43.333855 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.339527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.339609 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.339636 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.339667 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.339691 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:43Z","lastTransitionTime":"2026-03-09T16:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:43 crc kubenswrapper[4831]: E0309 16:00:43.359804 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.363902 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.363968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.363982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.363997 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.364009 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:43Z","lastTransitionTime":"2026-03-09T16:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:43 crc kubenswrapper[4831]: E0309 16:00:43.375921 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.379808 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.379864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.379881 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.379904 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.379920 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:43Z","lastTransitionTime":"2026-03-09T16:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:43 crc kubenswrapper[4831]: E0309 16:00:43.409179 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: E0309 16:00:43.409340 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.635174 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.649476 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.664532 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.677998 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.695715 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.711655 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: E0309 16:00:43.734642 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.736997 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.759052 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.779820 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.798311 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.817918 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.837803 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.850590 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.867321 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.878204 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.890950 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.921009 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15
:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:43 crc kubenswrapper[4831]: I0309 16:00:43.941303 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:43Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:44 crc kubenswrapper[4831]: I0309 16:00:44.616831 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:44 crc kubenswrapper[4831]: E0309 16:00:44.617283 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:44 crc kubenswrapper[4831]: I0309 16:00:44.617327 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:44 crc kubenswrapper[4831]: I0309 16:00:44.617349 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:44 crc kubenswrapper[4831]: I0309 16:00:44.617361 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:44 crc kubenswrapper[4831]: E0309 16:00:44.617712 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:44 crc kubenswrapper[4831]: E0309 16:00:44.617868 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:44 crc kubenswrapper[4831]: E0309 16:00:44.617973 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:44 crc kubenswrapper[4831]: I0309 16:00:44.618011 4831 scope.go:117] "RemoveContainer" containerID="274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c" Mar 09 16:00:44 crc kubenswrapper[4831]: E0309 16:00:44.618160 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:00:45 crc kubenswrapper[4831]: I0309 16:00:45.642763 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 16:00:46 crc kubenswrapper[4831]: I0309 16:00:46.617191 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:46 crc kubenswrapper[4831]: I0309 16:00:46.617267 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:46 crc kubenswrapper[4831]: I0309 16:00:46.617202 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:46 crc kubenswrapper[4831]: E0309 16:00:46.617356 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:46 crc kubenswrapper[4831]: E0309 16:00:46.617643 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:46 crc kubenswrapper[4831]: E0309 16:00:46.617737 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:46 crc kubenswrapper[4831]: I0309 16:00:46.617203 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:46 crc kubenswrapper[4831]: E0309 16:00:46.617867 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:46 crc kubenswrapper[4831]: I0309 16:00:46.618325 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:00:46 crc kubenswrapper[4831]: E0309 16:00:46.618479 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:00:48 crc kubenswrapper[4831]: I0309 16:00:48.616357 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:48 crc kubenswrapper[4831]: I0309 16:00:48.616421 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:48 crc kubenswrapper[4831]: E0309 16:00:48.616546 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:48 crc kubenswrapper[4831]: I0309 16:00:48.616349 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:48 crc kubenswrapper[4831]: I0309 16:00:48.616584 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:48 crc kubenswrapper[4831]: E0309 16:00:48.616645 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:48 crc kubenswrapper[4831]: E0309 16:00:48.616863 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:48 crc kubenswrapper[4831]: E0309 16:00:48.616927 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:48 crc kubenswrapper[4831]: E0309 16:00:48.736540 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.461506 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/0.log" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.461560 4831 generic.go:334] "Generic (PLEG): container finished" podID="c53277d4-7695-47e5-bacc-e6ab6dca1501" containerID="f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b" exitCode=1 Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.461593 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerDied","Data":"f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b"} Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.461969 4831 scope.go:117] "RemoveContainer" containerID="f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.481479 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.498172 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.519251 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.532542 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.547808 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.562888 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.578545 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.597429 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false 
hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.610881 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.616691 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.616763 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.616712 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:50 crc kubenswrapper[4831]: E0309 16:00:50.616852 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.616712 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:50 crc kubenswrapper[4831]: E0309 16:00:50.616928 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:50 crc kubenswrapper[4831]: E0309 16:00:50.617041 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:50 crc kubenswrapper[4831]: E0309 16:00:50.617149 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.621770 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.634465 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.657853 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.673196 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.691655 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.705429 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.722040 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.735946 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.748413 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:50 crc kubenswrapper[4831]: I0309 16:00:50.759876 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:50Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 
16:00:51.466749 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/0.log" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.466824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerStarted","Data":"8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6"} Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.485206 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.499711 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.517540 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.539183 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.560216 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.578176 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.597119 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f
0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.610442 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.648003 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.672349 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.688048 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.700302 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.721567 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.735998 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.749644 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.768444 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.784913 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.811384 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:51 crc kubenswrapper[4831]: I0309 16:00:51.830805 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:51Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:52 crc kubenswrapper[4831]: I0309 
16:00:52.616533 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:52 crc kubenswrapper[4831]: I0309 16:00:52.616603 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:52 crc kubenswrapper[4831]: I0309 16:00:52.616550 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:52 crc kubenswrapper[4831]: E0309 16:00:52.616744 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:52 crc kubenswrapper[4831]: I0309 16:00:52.616852 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:52 crc kubenswrapper[4831]: E0309 16:00:52.616994 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:52 crc kubenswrapper[4831]: E0309 16:00:52.617059 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:52 crc kubenswrapper[4831]: E0309 16:00:52.617149 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.448018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.448067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.448080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.448101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.448113 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:53Z","lastTransitionTime":"2026-03-09T16:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 16:00:53 crc kubenswrapper[4831]: E0309 16:00:53.465498 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.470473 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.470538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.470555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.470582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.470600 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:53Z","lastTransitionTime":"2026-03-09T16:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:53 crc kubenswrapper[4831]: E0309 16:00:53.483931 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.487916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.487944 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.487953 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.487968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.487977 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:53Z","lastTransitionTime":"2026-03-09T16:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:53 crc kubenswrapper[4831]: E0309 16:00:53.499674 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.503390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.503454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.503468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.503485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.503498 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:53Z","lastTransitionTime":"2026-03-09T16:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:53 crc kubenswrapper[4831]: E0309 16:00:53.516890 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.520768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.520816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.520825 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.520849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.520860 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:00:53Z","lastTransitionTime":"2026-03-09T16:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:00:53 crc kubenswrapper[4831]: E0309 16:00:53.539206 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: E0309 16:00:53.539378 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.637139 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.649495 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.670088 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.683388 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.694683 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.715774 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc 
kubenswrapper[4831]: E0309 16:00:53.737014 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.737589 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67
587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.753552 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.767704 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.782774 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.797583 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.809723 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.830737 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.842235 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.853914 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.866273 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.878920 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.891795 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:53 crc kubenswrapper[4831]: I0309 16:00:53.907330 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f
0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:53Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:54 crc kubenswrapper[4831]: I0309 16:00:54.617373 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:54 crc kubenswrapper[4831]: I0309 16:00:54.617434 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:54 crc kubenswrapper[4831]: I0309 16:00:54.617371 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:54 crc kubenswrapper[4831]: E0309 16:00:54.617558 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:54 crc kubenswrapper[4831]: E0309 16:00:54.617756 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:54 crc kubenswrapper[4831]: I0309 16:00:54.617783 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:54 crc kubenswrapper[4831]: E0309 16:00:54.617994 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:54 crc kubenswrapper[4831]: E0309 16:00:54.618030 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:55 crc kubenswrapper[4831]: I0309 16:00:55.617996 4831 scope.go:117] "RemoveContainer" containerID="274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.486249 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/2.log" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.490356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f"} Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.491112 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:00:56 crc kubenswrapper[4831]: 
I0309 16:00:56.513193 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.530200 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.548668 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.564029 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.587729 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.607016 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.616638 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.616685 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.616660 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.616638 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:56 crc kubenswrapper[4831]: E0309 16:00:56.616918 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:56 crc kubenswrapper[4831]: E0309 16:00:56.617019 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:56 crc kubenswrapper[4831]: E0309 16:00:56.617142 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:56 crc kubenswrapper[4831]: E0309 16:00:56.617251 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.632309 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.647360 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.661293 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.687536 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service 
op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.702033 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.713470 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.725480 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc 
kubenswrapper[4831]: I0309 16:00:56.753868 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.768229 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.779727 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.800101 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.820258 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:56 crc kubenswrapper[4831]: I0309 16:00:56.836776 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.498204 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/3.log" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.499593 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/2.log" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.504269 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" exitCode=1 Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.504364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f"} Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.504678 4831 scope.go:117] "RemoveContainer" containerID="274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.505520 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:00:57 crc kubenswrapper[4831]: E0309 16:00:57.505872 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.528438 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.559058 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.584198 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.605830 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.626244 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.640360 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.656676 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f
0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.668113 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.689490 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274d3faac535b8cdb1070a5708522bb5747c46c75ee3c670c2894e53df58ef5c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:29Z\\\",\\\"message\\\":\\\"-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false 
hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0309 16:00:29.459205 7172 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:29Z is after 2025-08-24T17:21:41Z]\\\\nI0309 16:00:29.459190 7172 services_controller.go:451] Built service op\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:56Z\\\",\\\"message\\\":\\\"etwork-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0309 16:00:56.598827 7493 services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for 
network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0309 16:00:56.599098 7493 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.705542 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.718868 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.731220 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc 
kubenswrapper[4831]: I0309 16:00:57.758716 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.776766 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.793304 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.816643 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.831523 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc kubenswrapper[4831]: I0309 16:00:57.845803 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:57 crc 
kubenswrapper[4831]: I0309 16:00:57.860442 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.511252 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/3.log" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.516736 4831 scope.go:117] "RemoveContainer" 
containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:00:58 crc kubenswrapper[4831]: E0309 16:00:58.517020 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.534121 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.580384 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:56Z\\\",\\\"message\\\":\\\"etwork-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0309 16:00:56.598827 7493 services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", 
inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0309 16:00:56.599098 7493 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.600962 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.616544 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.616588 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.616643 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:00:58 crc kubenswrapper[4831]: E0309 16:00:58.616690 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.616721 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:00:58 crc kubenswrapper[4831]: E0309 16:00:58.616850 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:00:58 crc kubenswrapper[4831]: E0309 16:00:58.616885 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:00:58 crc kubenswrapper[4831]: E0309 16:00:58.616937 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.621253 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.636266 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.657048 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.671870 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.684916 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.708093 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f
64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.729148 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: E0309 16:00:58.739054 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.742695 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc 
kubenswrapper[4831]: I0309 16:00:58.757831 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.773290 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.789031 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.806786 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.821145 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.837050 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.850537 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:00:58 crc kubenswrapper[4831]: I0309 16:00:58.864911 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f
0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:58Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:00 crc kubenswrapper[4831]: I0309 16:01:00.616351 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:00 crc kubenswrapper[4831]: I0309 16:01:00.616425 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:00 crc kubenswrapper[4831]: I0309 16:01:00.616475 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:00 crc kubenswrapper[4831]: E0309 16:01:00.616623 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:00 crc kubenswrapper[4831]: I0309 16:01:00.616689 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:00 crc kubenswrapper[4831]: E0309 16:01:00.616792 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:00 crc kubenswrapper[4831]: E0309 16:01:00.616862 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:00 crc kubenswrapper[4831]: E0309 16:01:00.616943 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:00 crc kubenswrapper[4831]: I0309 16:01:00.617592 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:01:00 crc kubenswrapper[4831]: E0309 16:01:00.617798 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:01:02 crc kubenswrapper[4831]: I0309 16:01:02.616636 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:02 crc kubenswrapper[4831]: I0309 16:01:02.616656 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:02 crc kubenswrapper[4831]: I0309 16:01:02.616852 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:02 crc kubenswrapper[4831]: E0309 16:01:02.616791 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:02 crc kubenswrapper[4831]: E0309 16:01:02.617189 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:02 crc kubenswrapper[4831]: E0309 16:01:02.617267 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:02 crc kubenswrapper[4831]: I0309 16:01:02.617619 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:02 crc kubenswrapper[4831]: E0309 16:01:02.617901 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.577267 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.577331 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.577343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.577364 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.577375 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:03Z","lastTransitionTime":"2026-03-09T16:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:03 crc kubenswrapper[4831]: E0309 16:01:03.601422 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.606659 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.606727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.606744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.606769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.606790 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:03Z","lastTransitionTime":"2026-03-09T16:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:03 crc kubenswrapper[4831]: E0309 16:01:03.628244 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.633452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.633506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.633519 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.633537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.633552 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:03Z","lastTransitionTime":"2026-03-09T16:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.642292 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: E0309 16:01:03.652273 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.657571 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.657629 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.657649 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.657674 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.657692 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:03Z","lastTransitionTime":"2026-03-09T16:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.664559 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: E0309 16:01:03.675892 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.680989 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.681065 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.681090 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.681121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.681146 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:03Z","lastTransitionTime":"2026-03-09T16:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.685979 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: E0309 16:01:03.700918 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: E0309 16:01:03.701070 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.707032 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.723789 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: E0309 16:01:03.739722 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.758562 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:56Z\\\",\\\"message\\\":\\\"etwork-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0309 
16:00:56.598827 7493 services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0309 16:00:56.599098 7493 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.778518 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f
5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.792435 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.805111 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc 
kubenswrapper[4831]: I0309 16:01:03.827107 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.845913 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.859326 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.873087 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.884321 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.896128 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.908862 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.919353 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.928898 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:03 crc kubenswrapper[4831]: I0309 16:01:03.942890 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:04 crc kubenswrapper[4831]: I0309 16:01:04.616902 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:04 crc kubenswrapper[4831]: I0309 16:01:04.616954 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:04 crc kubenswrapper[4831]: I0309 16:01:04.616954 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:04 crc kubenswrapper[4831]: I0309 16:01:04.617150 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:04 crc kubenswrapper[4831]: E0309 16:01:04.617170 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:04 crc kubenswrapper[4831]: E0309 16:01:04.617335 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:04 crc kubenswrapper[4831]: E0309 16:01:04.617509 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:04 crc kubenswrapper[4831]: E0309 16:01:04.617624 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.582918 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583074 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:10.583047797 +0000 UTC m=+257.716730220 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.583353 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.583424 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.583460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.583492 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583606 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583621 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583643 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583681 4831 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583694 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583651 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583750 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583618 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583664 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:02:10.583645424 +0000 UTC m=+257.717327847 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583832 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 16:02:10.583811878 +0000 UTC m=+257.717494371 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583847 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 16:02:10.583837669 +0000 UTC m=+257.717520192 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.583874 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 16:02:10.58386503 +0000 UTC m=+257.717547543 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.617147 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.617225 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.617225 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.617296 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.617438 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.617509 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.617596 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.617738 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:06 crc kubenswrapper[4831]: I0309 16:01:06.684091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.684236 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:01:06 crc kubenswrapper[4831]: E0309 16:01:06.684345 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs podName:bdf8f784-8094-4b1c-96bb-f7997430a0ea nodeName:}" failed. No retries permitted until 2026-03-09 16:02:10.684318523 +0000 UTC m=+257.818000976 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs") pod "network-metrics-daemon-2597x" (UID: "bdf8f784-8094-4b1c-96bb-f7997430a0ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 16:01:08 crc kubenswrapper[4831]: I0309 16:01:08.616744 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:08 crc kubenswrapper[4831]: I0309 16:01:08.616810 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:08 crc kubenswrapper[4831]: E0309 16:01:08.616868 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:08 crc kubenswrapper[4831]: I0309 16:01:08.616758 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:08 crc kubenswrapper[4831]: I0309 16:01:08.617012 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:08 crc kubenswrapper[4831]: E0309 16:01:08.617090 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:08 crc kubenswrapper[4831]: E0309 16:01:08.617220 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:08 crc kubenswrapper[4831]: E0309 16:01:08.617337 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:08 crc kubenswrapper[4831]: E0309 16:01:08.741859 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:10 crc kubenswrapper[4831]: I0309 16:01:10.617321 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:10 crc kubenswrapper[4831]: I0309 16:01:10.617429 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:10 crc kubenswrapper[4831]: I0309 16:01:10.617354 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:10 crc kubenswrapper[4831]: I0309 16:01:10.617395 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:10 crc kubenswrapper[4831]: E0309 16:01:10.617599 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:10 crc kubenswrapper[4831]: E0309 16:01:10.617735 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:10 crc kubenswrapper[4831]: E0309 16:01:10.617851 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:10 crc kubenswrapper[4831]: E0309 16:01:10.617933 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:12 crc kubenswrapper[4831]: I0309 16:01:12.617198 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:12 crc kubenswrapper[4831]: I0309 16:01:12.617474 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:12 crc kubenswrapper[4831]: E0309 16:01:12.617668 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:12 crc kubenswrapper[4831]: I0309 16:01:12.618040 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:12 crc kubenswrapper[4831]: I0309 16:01:12.618081 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:12 crc kubenswrapper[4831]: E0309 16:01:12.618146 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:12 crc kubenswrapper[4831]: E0309 16:01:12.618307 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:12 crc kubenswrapper[4831]: E0309 16:01:12.618849 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:12 crc kubenswrapper[4831]: I0309 16:01:12.619344 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:01:12 crc kubenswrapper[4831]: E0309 16:01:12.619732 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.618933 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.619223 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.635135 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ea8a88-d81e-4763-99cb-2612d0d460e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a3cf35d1b2757c1e1aa909ae5028182f8df005ac94c5e617d08c9d4bf25044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03
-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603d08db38adc4d38643fe7d8c11575020a3077a796692bb4253f1230631c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d0dc044ad3aa7991a0e0cb6d3d271532aebcf912b9d137a6d0915fd1d68b6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f583ad77e22455fdf0ebd0a2fb8fd902a5a956f1712ac602f820d8d6fd8f10f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.650721 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.666876 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.680351 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddadf6feffdd0b8dc92eac8280dbef54dbc0b991c6a96f0c0f6410041b3b9880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a210ad6ebd3678eb738930ba6e39af8a406902a768e0cacdd47b4a789d1bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.694421 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8413c5dab0145b76c6979ad37650717e40808aa4c65ae26b43df125f35fa1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.708056 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9c746" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c53277d4-7695-47e5-bacc-e6ab6dca1501\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:49Z\\\",\\\"message\\\":\\\"2026-03-09T16:00:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f\\\\n2026-03-09T16:00:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d209b2a-f633-44a5-a1e5-f99c4514633f to /host/opt/cni/bin/\\\\n2026-03-09T16:00:04Z [verbose] multus-daemon started\\\\n2026-03-09T16:00:04Z [verbose] Readiness Indicator file check\\\\n2026-03-09T16:00:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nptbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9c746\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.724873 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sdswt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1de3c2a-8954-4286-aa94-b16d80cf28ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88d8f81a2537ebfb9d4807366e5f29fc9ca4ead9ada54975755349228ba212f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21911a5d94f4b37edbcfb83f820327c028eb814bbea7b5edaf01103b109a07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e24c268b05b7dc4b0044fe34f41590a711c92b60f6e6c54c719a7719b401c80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895e49606369b54d8bafb33e350b2cbdde31fee42a14bdec2ad4d8358e30a947\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06a0f
0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a0f0224aa634a6d2ec32d30aa2babaa7ee6d4f68fa9b50c5882cee33f29263\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f381b2f5e7083057905cebd6345a76c7a08bf43fb214778baf69bf2fe68f500b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef7bb6af911a7e887db767d9f5ff5deec0bd94add90032ca018f5210754d55b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4msk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sdswt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.740039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae817cb5-b8a8-47c9-aa49-53a975c9329a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f448c64d2d76c48413d8cb69f3fbcd4a96fc8019e0c447d9f587f7693e95050b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b527f690235e00facabc86acba91092c3599f5e7578eda3903049279bb3f774c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ghhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zq8r6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.742327 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.751178 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kkb76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df601c5a-632b-476d-aa81-12f31472e452\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fa053549dbfa0b9fab3e44094579caf4ab431ee5f67ee99ff46dd2e098c944a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kkb76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.763819 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c73e955-28e4-4f5b-b376-76bd99d55020\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://858ddf0de5bdb96e564dc9b934f6b4728afb389db01f36c919ff9db14f0a84dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e787ec0f8c997051474917fa8f9bc45f018ebb121db7d3da07d53bf925d9cfc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.786905 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e08e05-f986-46db-bede-e739cd66b542\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c63dee16346a5985f0885d9e06517647046b2dc4bec9809a3a1216eb29d291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6ec8cb45b5413946981375eb4d988b599cc689b4a8d703314a1784d77b4bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://321230d8b6f1d5866be3c9846661ae9930b27e1b83b4c26a2dcbc99ea7decf48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce67587fd9aded345acb419c039f34a7fbe4c3dbe149a67d651b76d7822d129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d3a9de3b1cd73517a9dcd2268c2e6497452be67a9e8898c25d148f808e4a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8343ed14e862dc6a975579946c76ea684a7d2dbdad3b205d193b05b0991e2dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7388a4423d7961596694b21b5385b7acbd603e6f97994732144d8790920b20fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a0a4fe6e16a17d335a65b25fdf5b4048635a94b89d5190790364a5e7354db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.801652 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d1e9253-691d-4496-8705-2b99d82a79f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9b5d3dafe27acc32ec59bc6c09c7e6b5e8c27e33504396f055ec60d4207915f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8682cc6a76edc5f53ff0a63935edb6f6f922e061615c3f6b6814651bb042157f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:58:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 15:58:21.228813 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 15:58:21.230378 1 observer_polling.go:159] Starting file observer\\\\nI0309 15:58:21.232143 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 15:58:21.233315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 15:58:50.748494 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 15:58:50.748570 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:58:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad49b1b45de1326c4a333246e9597d81b24d089b6298c878396f33659d84f87e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffb62a64b0fecfa1a5de5d42f8ed95c6f46da0ed8cec3bcb66064a157c0d262\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.818540 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.833103 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46nwt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6f55879-dc86-45fb-8f15-9294bea295d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dc308ed29eb13dc39c1c322d79a26004983f0e24e559975e37919c1bdf9ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g99g6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46nwt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.853999 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.854095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.854107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.854129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.854143 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:13Z","lastTransitionTime":"2026-03-09T16:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.863696 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"498bff7b-8be5-4e87-8717-0de7f7a8b877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T16:00:56Z\\\",\\\"message\\\":\\\"etwork-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0309 16:00:56.598827 7493 services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", 
inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0309 16:00:56.599098 7493 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:00:56Z is after \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d2f4f084d75b071f8
69a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T16:00:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xp564\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7jxjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.872442 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.876841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.876876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.876887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.876912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.876926 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:13Z","lastTransitionTime":"2026-03-09T16:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.884370 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241ae37c-298d-408e-85c3-b88a569b096c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T15:57:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T15:59:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 15:59:58.501876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 15:59:58.502090 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 15:59:58.503202 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3511582648/tls.crt::/tmp/serving-cert-3511582648/tls.key\\\\\\\"\\\\nI0309 15:59:59.058890 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 15:59:59.061032 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 15:59:59.061052 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 15:59:59.061078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 15:59:59.061085 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 15:59:59.065531 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 15:59:59.065641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0309 15:59:59.065549 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 15:59:59.065695 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 15:59:59.065808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 15:59:59.065855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 15:59:59.065901 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 15:59:59.065946 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 15:59:59.067417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T15:59:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T15:57:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T15:57:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T15:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T15:57:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.891652 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.895723 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.895756 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.895769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.895787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.895799 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:13Z","lastTransitionTime":"2026-03-09T16:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.899122 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30fc8c9c12a822935755416bd07315e44a2ed83ee120446c6d0e168c707f4b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.908933 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.909698 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2597x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdf8f784-8094-4b1c-96bb-f7997430a0ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqwpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2597x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc 
kubenswrapper[4831]: I0309 16:01:13.914141 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.914181 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.914191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.914211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.914222 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:13Z","lastTransitionTime":"2026-03-09T16:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.925041 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T16:00:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b37d92739e978c49645450a808f3296773a79b0f703d165fac39161f724b8a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T16:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znpxs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T16:00:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4mvxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.927257 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.931433 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.931478 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.931492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.931510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:13 crc kubenswrapper[4831]: I0309 16:01:13.931523 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:13Z","lastTransitionTime":"2026-03-09T16:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.944720 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T16:01:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7f9fb4aa-30fc-49bf-b554-e009613f58b0\\\",\\\"systemUUID\\\":\\\"7d9f8d15-aabe-48b4-8d2f-58416afd8526\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T16:01:13Z is after 2025-08-24T17:21:41Z" Mar 09 16:01:13 crc kubenswrapper[4831]: E0309 16:01:13.944882 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 16:01:14 crc kubenswrapper[4831]: I0309 16:01:14.617096 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:14 crc kubenswrapper[4831]: E0309 16:01:14.617243 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:14 crc kubenswrapper[4831]: I0309 16:01:14.617586 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:14 crc kubenswrapper[4831]: I0309 16:01:14.617622 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:14 crc kubenswrapper[4831]: E0309 16:01:14.617691 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:14 crc kubenswrapper[4831]: I0309 16:01:14.617776 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:14 crc kubenswrapper[4831]: E0309 16:01:14.617881 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:14 crc kubenswrapper[4831]: E0309 16:01:14.617962 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:16 crc kubenswrapper[4831]: I0309 16:01:16.617368 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:16 crc kubenswrapper[4831]: I0309 16:01:16.617454 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:16 crc kubenswrapper[4831]: I0309 16:01:16.617475 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:16 crc kubenswrapper[4831]: I0309 16:01:16.617475 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:16 crc kubenswrapper[4831]: E0309 16:01:16.617589 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:16 crc kubenswrapper[4831]: E0309 16:01:16.617743 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:16 crc kubenswrapper[4831]: E0309 16:01:16.617829 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:16 crc kubenswrapper[4831]: E0309 16:01:16.617908 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:18 crc kubenswrapper[4831]: I0309 16:01:18.616783 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:18 crc kubenswrapper[4831]: E0309 16:01:18.617080 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:18 crc kubenswrapper[4831]: I0309 16:01:18.617568 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:18 crc kubenswrapper[4831]: E0309 16:01:18.617717 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:18 crc kubenswrapper[4831]: I0309 16:01:18.617752 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:18 crc kubenswrapper[4831]: I0309 16:01:18.617810 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:18 crc kubenswrapper[4831]: E0309 16:01:18.617924 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:18 crc kubenswrapper[4831]: E0309 16:01:18.618047 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:18 crc kubenswrapper[4831]: E0309 16:01:18.744042 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:20 crc kubenswrapper[4831]: I0309 16:01:20.616472 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:20 crc kubenswrapper[4831]: I0309 16:01:20.616567 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:20 crc kubenswrapper[4831]: I0309 16:01:20.616588 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:20 crc kubenswrapper[4831]: E0309 16:01:20.616619 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:20 crc kubenswrapper[4831]: I0309 16:01:20.616499 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:20 crc kubenswrapper[4831]: E0309 16:01:20.616699 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:20 crc kubenswrapper[4831]: E0309 16:01:20.616778 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:20 crc kubenswrapper[4831]: E0309 16:01:20.616956 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:22 crc kubenswrapper[4831]: I0309 16:01:22.616706 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:22 crc kubenswrapper[4831]: I0309 16:01:22.616830 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:22 crc kubenswrapper[4831]: E0309 16:01:22.616902 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:22 crc kubenswrapper[4831]: I0309 16:01:22.616943 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:22 crc kubenswrapper[4831]: I0309 16:01:22.616939 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:22 crc kubenswrapper[4831]: E0309 16:01:22.617123 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:22 crc kubenswrapper[4831]: E0309 16:01:22.617310 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:22 crc kubenswrapper[4831]: E0309 16:01:22.617388 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.681714 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9c746" podStartSLOduration=142.681686653 podStartE2EDuration="2m22.681686653s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.681654882 +0000 UTC m=+210.815337305" watchObservedRunningTime="2026-03-09 16:01:23.681686653 +0000 UTC m=+210.815369106" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.723671 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sdswt" podStartSLOduration=142.723641703 podStartE2EDuration="2m22.723641703s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.706270923 +0000 UTC m=+210.839953346" watchObservedRunningTime="2026-03-09 16:01:23.723641703 +0000 UTC m=+210.857324126" Mar 09 16:01:23 crc kubenswrapper[4831]: E0309 16:01:23.744562 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.753470 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-46nwt" podStartSLOduration=142.753444651 podStartE2EDuration="2m22.753444651s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.72466862 +0000 UTC m=+210.858351043" watchObservedRunningTime="2026-03-09 16:01:23.753444651 +0000 UTC m=+210.887127074" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.765290 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq8r6" podStartSLOduration=141.765268154 podStartE2EDuration="2m21.765268154s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.765233543 +0000 UTC m=+210.898915986" watchObservedRunningTime="2026-03-09 16:01:23.765268154 +0000 UTC m=+210.898950597" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.778089 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kkb76" podStartSLOduration=142.778065732 podStartE2EDuration="2m22.778065732s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.777829446 +0000 UTC m=+210.911511869" watchObservedRunningTime="2026-03-09 16:01:23.778065732 +0000 UTC m=+210.911748155" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.793510 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=38.793492361 
podStartE2EDuration="38.793492361s" podCreationTimestamp="2026-03-09 16:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.792993997 +0000 UTC m=+210.926676420" watchObservedRunningTime="2026-03-09 16:01:23.793492361 +0000 UTC m=+210.927174794" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.824673 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=55.824657325 podStartE2EDuration="55.824657325s" podCreationTimestamp="2026-03-09 16:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.822917269 +0000 UTC m=+210.956599692" watchObservedRunningTime="2026-03-09 16:01:23.824657325 +0000 UTC m=+210.958339748" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.841891 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=61.84186164 podStartE2EDuration="1m1.84186164s" podCreationTimestamp="2026-03-09 16:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.841250014 +0000 UTC m=+210.974932477" watchObservedRunningTime="2026-03-09 16:01:23.84186164 +0000 UTC m=+210.975544093" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.924887 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podStartSLOduration=142.924859156 podStartE2EDuration="2m22.924859156s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.924490476 +0000 UTC 
m=+211.058172899" watchObservedRunningTime="2026-03-09 16:01:23.924859156 +0000 UTC m=+211.058541569" Mar 09 16:01:23 crc kubenswrapper[4831]: I0309 16:01:23.955862 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=70.955841265 podStartE2EDuration="1m10.955841265s" podCreationTimestamp="2026-03-09 16:00:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:23.94279951 +0000 UTC m=+211.076481933" watchObservedRunningTime="2026-03-09 16:01:23.955841265 +0000 UTC m=+211.089523688" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.147554 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.147608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.147620 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.147640 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.147653 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T16:01:24Z","lastTransitionTime":"2026-03-09T16:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.203305 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd"] Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.203731 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.207811 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.207861 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.208004 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.208077 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.279671 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e38a9308-965d-4997-b940-33d461a122fd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.280085 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38a9308-965d-4997-b940-33d461a122fd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: 
\"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.280136 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e38a9308-965d-4997-b940-33d461a122fd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.280174 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e38a9308-965d-4997-b940-33d461a122fd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.280203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e38a9308-965d-4997-b940-33d461a122fd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.381311 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e38a9308-965d-4997-b940-33d461a122fd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.381362 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38a9308-965d-4997-b940-33d461a122fd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.381418 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e38a9308-965d-4997-b940-33d461a122fd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.381462 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e38a9308-965d-4997-b940-33d461a122fd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.381493 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e38a9308-965d-4997-b940-33d461a122fd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.381544 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e38a9308-965d-4997-b940-33d461a122fd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.382605 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e38a9308-965d-4997-b940-33d461a122fd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.383334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e38a9308-965d-4997-b940-33d461a122fd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.393436 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38a9308-965d-4997-b940-33d461a122fd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.399329 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e38a9308-965d-4997-b940-33d461a122fd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c2ksd\" (UID: \"e38a9308-965d-4997-b940-33d461a122fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.518887 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.606682 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" event={"ID":"e38a9308-965d-4997-b940-33d461a122fd","Type":"ContainerStarted","Data":"99fc37f3c377938c1b381c4ceee3aa77af0d738d18f2587d51e6df5b5fe32d22"} Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.616646 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.616727 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.617192 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.617552 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:24 crc kubenswrapper[4831]: E0309 16:01:24.617766 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:24 crc kubenswrapper[4831]: E0309 16:01:24.617856 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:24 crc kubenswrapper[4831]: E0309 16:01:24.617912 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:24 crc kubenswrapper[4831]: E0309 16:01:24.618232 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.659431 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 16:01:24 crc kubenswrapper[4831]: I0309 16:01:24.669142 4831 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 16:01:25 crc kubenswrapper[4831]: I0309 16:01:25.611849 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" event={"ID":"e38a9308-965d-4997-b940-33d461a122fd","Type":"ContainerStarted","Data":"d60631ebf9e10b8399a12e12631392a9672319b20c3fc2ac43cdd7c8415ae495"} Mar 09 16:01:25 crc kubenswrapper[4831]: I0309 16:01:25.618261 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:01:25 crc kubenswrapper[4831]: I0309 16:01:25.618410 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:01:25 crc kubenswrapper[4831]: E0309 16:01:25.618503 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7jxjf_openshift-ovn-kubernetes(498bff7b-8be5-4e87-8717-0de7f7a8b877)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" Mar 09 16:01:25 crc kubenswrapper[4831]: I0309 16:01:25.629072 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2ksd" podStartSLOduration=144.629050828 podStartE2EDuration="2m24.629050828s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:25.628684998 +0000 UTC m=+212.762367421" watchObservedRunningTime="2026-03-09 16:01:25.629050828 +0000 UTC m=+212.762733251" Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.616478 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:26 crc kubenswrapper[4831]: E0309 16:01:26.616679 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.616709 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.616771 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.616804 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:26 crc kubenswrapper[4831]: E0309 16:01:26.617087 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:26 crc kubenswrapper[4831]: E0309 16:01:26.617386 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:26 crc kubenswrapper[4831]: E0309 16:01:26.617476 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.619875 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.621850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee"} Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.622214 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:01:26 crc kubenswrapper[4831]: I0309 16:01:26.644535 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podStartSLOduration=80.644508811 podStartE2EDuration="1m20.644508811s" podCreationTimestamp="2026-03-09 16:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:26.644324676 +0000 UTC m=+213.778007099" watchObservedRunningTime="2026-03-09 16:01:26.644508811 +0000 UTC m=+213.778191314" Mar 09 16:01:28 crc kubenswrapper[4831]: I0309 16:01:28.616469 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:28 crc kubenswrapper[4831]: I0309 16:01:28.616488 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:28 crc kubenswrapper[4831]: I0309 16:01:28.616563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:28 crc kubenswrapper[4831]: E0309 16:01:28.617580 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:28 crc kubenswrapper[4831]: E0309 16:01:28.617381 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:28 crc kubenswrapper[4831]: E0309 16:01:28.617679 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:28 crc kubenswrapper[4831]: I0309 16:01:28.617011 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:28 crc kubenswrapper[4831]: E0309 16:01:28.617964 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:28 crc kubenswrapper[4831]: E0309 16:01:28.746450 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:30 crc kubenswrapper[4831]: I0309 16:01:30.616964 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:30 crc kubenswrapper[4831]: I0309 16:01:30.617027 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:30 crc kubenswrapper[4831]: I0309 16:01:30.617026 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:30 crc kubenswrapper[4831]: I0309 16:01:30.617150 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:30 crc kubenswrapper[4831]: E0309 16:01:30.617159 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:30 crc kubenswrapper[4831]: E0309 16:01:30.617273 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:30 crc kubenswrapper[4831]: E0309 16:01:30.617513 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:30 crc kubenswrapper[4831]: E0309 16:01:30.617666 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:32 crc kubenswrapper[4831]: I0309 16:01:32.616756 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:32 crc kubenswrapper[4831]: I0309 16:01:32.616824 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:32 crc kubenswrapper[4831]: I0309 16:01:32.616824 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:32 crc kubenswrapper[4831]: E0309 16:01:32.616972 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:32 crc kubenswrapper[4831]: I0309 16:01:32.617043 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:32 crc kubenswrapper[4831]: E0309 16:01:32.617211 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:32 crc kubenswrapper[4831]: E0309 16:01:32.617339 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:32 crc kubenswrapper[4831]: E0309 16:01:32.617498 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:33 crc kubenswrapper[4831]: E0309 16:01:33.746910 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:34 crc kubenswrapper[4831]: I0309 16:01:34.617441 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:34 crc kubenswrapper[4831]: E0309 16:01:34.617934 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:34 crc kubenswrapper[4831]: I0309 16:01:34.617606 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:34 crc kubenswrapper[4831]: E0309 16:01:34.618039 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:34 crc kubenswrapper[4831]: I0309 16:01:34.617606 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:34 crc kubenswrapper[4831]: E0309 16:01:34.618115 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:34 crc kubenswrapper[4831]: I0309 16:01:34.617461 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:34 crc kubenswrapper[4831]: E0309 16:01:34.618184 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.616877 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.616935 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.617075 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.617219 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:36 crc kubenswrapper[4831]: E0309 16:01:36.617464 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:36 crc kubenswrapper[4831]: E0309 16:01:36.617621 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.617676 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:01:36 crc kubenswrapper[4831]: E0309 16:01:36.617795 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:36 crc kubenswrapper[4831]: E0309 16:01:36.618022 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.659779 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/1.log" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.660258 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/0.log" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.660308 4831 generic.go:334] "Generic (PLEG): container finished" podID="c53277d4-7695-47e5-bacc-e6ab6dca1501" containerID="8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6" exitCode=1 Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.660342 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerDied","Data":"8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6"} Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.660379 4831 scope.go:117] "RemoveContainer" containerID="f76c515bce7e8b531966031ad82dbd51a9a5224b5a25a9726f93a7b175e3d41b" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.660804 4831 scope.go:117] "RemoveContainer" containerID="8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6" Mar 09 16:01:36 crc kubenswrapper[4831]: E0309 16:01:36.660959 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9c746_openshift-multus(c53277d4-7695-47e5-bacc-e6ab6dca1501)\"" pod="openshift-multus/multus-9c746" podUID="c53277d4-7695-47e5-bacc-e6ab6dca1501" Mar 09 16:01:36 crc kubenswrapper[4831]: I0309 16:01:36.671874 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:01:37 crc kubenswrapper[4831]: I0309 16:01:37.654240 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2597x"] Mar 09 16:01:37 crc kubenswrapper[4831]: I0309 16:01:37.654550 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:37 crc kubenswrapper[4831]: E0309 16:01:37.654744 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:37 crc kubenswrapper[4831]: I0309 16:01:37.668216 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/3.log" Mar 09 16:01:37 crc kubenswrapper[4831]: I0309 16:01:37.673642 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerStarted","Data":"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128"} Mar 09 16:01:37 crc kubenswrapper[4831]: I0309 16:01:37.675563 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/1.log" Mar 09 16:01:37 crc kubenswrapper[4831]: I0309 16:01:37.676391 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:01:37 crc kubenswrapper[4831]: I0309 16:01:37.726903 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podStartSLOduration=156.726877134 podStartE2EDuration="2m36.726877134s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:37.72596142 +0000 UTC m=+224.859643893" watchObservedRunningTime="2026-03-09 16:01:37.726877134 +0000 UTC m=+224.860559597" Mar 09 16:01:38 crc kubenswrapper[4831]: I0309 16:01:38.617418 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:38 crc kubenswrapper[4831]: E0309 16:01:38.617868 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:38 crc kubenswrapper[4831]: I0309 16:01:38.617536 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:38 crc kubenswrapper[4831]: E0309 16:01:38.617947 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:38 crc kubenswrapper[4831]: I0309 16:01:38.617466 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:38 crc kubenswrapper[4831]: E0309 16:01:38.618001 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:38 crc kubenswrapper[4831]: E0309 16:01:38.749072 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:39 crc kubenswrapper[4831]: I0309 16:01:39.616718 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:39 crc kubenswrapper[4831]: E0309 16:01:39.616861 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:40 crc kubenswrapper[4831]: I0309 16:01:40.616375 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:40 crc kubenswrapper[4831]: E0309 16:01:40.616533 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:40 crc kubenswrapper[4831]: I0309 16:01:40.616563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:40 crc kubenswrapper[4831]: E0309 16:01:40.616637 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:40 crc kubenswrapper[4831]: I0309 16:01:40.616542 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:40 crc kubenswrapper[4831]: E0309 16:01:40.616709 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:41 crc kubenswrapper[4831]: I0309 16:01:41.617027 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:41 crc kubenswrapper[4831]: E0309 16:01:41.617272 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:42 crc kubenswrapper[4831]: I0309 16:01:42.616875 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:42 crc kubenswrapper[4831]: I0309 16:01:42.616922 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:42 crc kubenswrapper[4831]: I0309 16:01:42.616875 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:42 crc kubenswrapper[4831]: E0309 16:01:42.617062 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:42 crc kubenswrapper[4831]: E0309 16:01:42.617160 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:42 crc kubenswrapper[4831]: E0309 16:01:42.617353 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:43 crc kubenswrapper[4831]: I0309 16:01:43.617307 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:43 crc kubenswrapper[4831]: E0309 16:01:43.618622 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:43 crc kubenswrapper[4831]: E0309 16:01:43.749572 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:44 crc kubenswrapper[4831]: I0309 16:01:44.617291 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:44 crc kubenswrapper[4831]: I0309 16:01:44.617429 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:44 crc kubenswrapper[4831]: E0309 16:01:44.617870 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:44 crc kubenswrapper[4831]: I0309 16:01:44.617466 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:44 crc kubenswrapper[4831]: E0309 16:01:44.617998 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:44 crc kubenswrapper[4831]: E0309 16:01:44.618061 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:45 crc kubenswrapper[4831]: I0309 16:01:45.617229 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:45 crc kubenswrapper[4831]: E0309 16:01:45.617473 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:46 crc kubenswrapper[4831]: I0309 16:01:46.617388 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:46 crc kubenswrapper[4831]: I0309 16:01:46.617594 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:46 crc kubenswrapper[4831]: I0309 16:01:46.617592 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:46 crc kubenswrapper[4831]: E0309 16:01:46.617789 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:46 crc kubenswrapper[4831]: E0309 16:01:46.617963 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:46 crc kubenswrapper[4831]: E0309 16:01:46.618172 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:47 crc kubenswrapper[4831]: I0309 16:01:47.616915 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:47 crc kubenswrapper[4831]: E0309 16:01:47.617080 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:48 crc kubenswrapper[4831]: I0309 16:01:48.616924 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:48 crc kubenswrapper[4831]: I0309 16:01:48.616976 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:48 crc kubenswrapper[4831]: E0309 16:01:48.617119 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:48 crc kubenswrapper[4831]: I0309 16:01:48.617157 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:48 crc kubenswrapper[4831]: E0309 16:01:48.617376 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:48 crc kubenswrapper[4831]: E0309 16:01:48.617507 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:48 crc kubenswrapper[4831]: I0309 16:01:48.618270 4831 scope.go:117] "RemoveContainer" containerID="8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6" Mar 09 16:01:48 crc kubenswrapper[4831]: E0309 16:01:48.750835 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:01:49 crc kubenswrapper[4831]: I0309 16:01:49.616970 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:49 crc kubenswrapper[4831]: E0309 16:01:49.617175 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:49 crc kubenswrapper[4831]: I0309 16:01:49.721067 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/1.log" Mar 09 16:01:49 crc kubenswrapper[4831]: I0309 16:01:49.721147 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerStarted","Data":"b0d8e1fbe63294dc472671512e54914adf4171dc77407e3355faa93329952062"} Mar 09 16:01:50 crc kubenswrapper[4831]: I0309 16:01:50.616558 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:50 crc kubenswrapper[4831]: I0309 16:01:50.616669 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:50 crc kubenswrapper[4831]: E0309 16:01:50.616745 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:50 crc kubenswrapper[4831]: I0309 16:01:50.616767 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:50 crc kubenswrapper[4831]: E0309 16:01:50.616897 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:50 crc kubenswrapper[4831]: E0309 16:01:50.617038 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:51 crc kubenswrapper[4831]: I0309 16:01:51.617227 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:51 crc kubenswrapper[4831]: E0309 16:01:51.617468 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:52 crc kubenswrapper[4831]: I0309 16:01:52.616445 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:52 crc kubenswrapper[4831]: I0309 16:01:52.616531 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:52 crc kubenswrapper[4831]: I0309 16:01:52.616445 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:52 crc kubenswrapper[4831]: E0309 16:01:52.616660 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 16:01:52 crc kubenswrapper[4831]: E0309 16:01:52.616834 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 16:01:52 crc kubenswrapper[4831]: E0309 16:01:52.616989 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 16:01:53 crc kubenswrapper[4831]: I0309 16:01:53.617346 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:53 crc kubenswrapper[4831]: E0309 16:01:53.618908 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2597x" podUID="bdf8f784-8094-4b1c-96bb-f7997430a0ea" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.545551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.593167 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rd78d"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.593840 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.594437 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lg8th"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.595782 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c7t6"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.595903 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.596704 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.596883 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.597608 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.601706 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wgqf6"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.602523 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.617215 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mbssj"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.617800 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jnsfp"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.617913 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.618112 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.618192 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.618584 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.618658 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.618670 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.619608 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.619880 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.619988 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620034 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620183 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620214 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620289 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/bad40e77-ebaa-48c9-a463-b2e821fbe30f-node-pullsecrets\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620338 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-etcd-client\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-audit\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620391 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620497 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bad40e77-ebaa-48c9-a463-b2e821fbe30f-audit-dir\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620535 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-etcd-serving-ca\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620559 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-serving-cert\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620601 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-image-import-ca\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620639 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620661 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-encryption-config\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620688 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cql28\" (UniqueName: 
\"kubernetes.io/projected/bad40e77-ebaa-48c9-a463-b2e821fbe30f-kube-api-access-cql28\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620714 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-config\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620805 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.620840 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.621730 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.622150 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.622247 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.622486 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.622512 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.622667 4831 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.622840 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623007 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623189 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623315 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2nnxk"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623480 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623327 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623359 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623385 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623653 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.623718 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 
16:01:54.623878 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.624112 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.624122 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.625220 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.625542 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.625727 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.628295 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.629561 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.630063 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.630639 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.631921 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.633724 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.633882 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.633979 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.634140 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.634202 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.634446 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.634564 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.640972 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.641580 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.645160 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.646445 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fgls5"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.647137 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.649259 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.649787 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.653353 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.653881 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.665903 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.667217 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.667709 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.667917 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668045 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668155 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668289 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668508 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668713 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668870 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.669182 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.669320 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.669786 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.670092 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.671505 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.671747 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.672328 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.672587 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.672854 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.673246 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.673418 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 
16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.673630 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.673694 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.673831 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.673996 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.674134 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.674267 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668059 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.673487 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.672371 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.674729 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.674837 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" 
Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.674841 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.668099 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.675684 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.680310 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.680484 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.694572 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.694731 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695131 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695384 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695541 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695726 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695751 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695844 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695922 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695940 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.695946 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.698184 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ln52t"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.699028 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.699515 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.699649 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.699673 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 16:01:54 crc kubenswrapper[4831]: 
I0309 16:01:54.699713 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.699962 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.701602 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.702351 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.703470 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.706830 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.710027 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.710608 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.710774 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.711672 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.712186 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.712463 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ntw62"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.712809 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.713159 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.713557 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.713737 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.713748 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.713967 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ntw62" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.714811 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.715743 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.716125 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.716948 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-q7h8z"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.717379 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.717740 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.717985 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.718135 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5gntv"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.718506 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.718847 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fcd87"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.719739 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.719762 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735540 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-encryption-config\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735587 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-config\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735609 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-etcd-service-ca\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-etcd-client\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: 
I0309 16:01:54.735656 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlc9m\" (UniqueName: \"kubernetes.io/projected/f9b292d9-3fc4-40cb-a74b-00c849999f8c-kube-api-access-vlc9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735675 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0fb443-0ee4-4505-b781-b14c49d069bf-config\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735693 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-config\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735730 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/314f92cb-af76-454e-b67c-f056477de5e9-images\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735746 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtfc\" (UniqueName: \"kubernetes.io/projected/f07d0628-187a-492f-8dee-f1e28ba448cb-kube-api-access-tgtfc\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735769 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b292d9-3fc4-40cb-a74b-00c849999f8c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735789 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b292d9-3fc4-40cb-a74b-00c849999f8c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735806 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735834 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-client-ca\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735853 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8xrt\" (UniqueName: \"kubernetes.io/projected/314f92cb-af76-454e-b67c-f056477de5e9-kube-api-access-n8xrt\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735874 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-serving-cert\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735890 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-etcd-client\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/463e7811-9c30-4106-942b-19deb65f748c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735950 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-image-import-ca\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.735966 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q48j\" (UniqueName: \"kubernetes.io/projected/62d263d2-2a73-401e-9922-59d5939d6b24-kube-api-access-5q48j\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736008 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-serving-cert\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 
crc kubenswrapper[4831]: I0309 16:01:54.736026 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktdhv\" (UniqueName: \"kubernetes.io/projected/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-kube-api-access-ktdhv\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736047 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736065 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736081 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr8bw\" (UniqueName: \"kubernetes.io/projected/62d1a4be-a162-466f-b579-247a86379faa-kube-api-access-pr8bw\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736100 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870c85dc-0d14-4681-be3c-9e87bee849d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736136 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a0fb443-0ee4-4505-b781-b14c49d069bf-trusted-ca\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736152 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a18279da-6931-4175-af9d-5fca5a160c91-serving-cert\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj7jq\" (UniqueName: \"kubernetes.io/projected/870c85dc-0d14-4681-be3c-9e87bee849d8-kube-api-access-qj7jq\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736201 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736223 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/870c85dc-0d14-4681-be3c-9e87bee849d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736241 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870c85dc-0d14-4681-be3c-9e87bee849d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0fb443-0ee4-4505-b781-b14c49d069bf-serving-cert\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736275 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62d1a4be-a162-466f-b579-247a86379faa-audit-dir\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736297 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-config\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736317 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bad40e77-ebaa-48c9-a463-b2e821fbe30f-node-pullsecrets\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736332 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463e7811-9c30-4106-942b-19deb65f748c-config\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736361 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-audit\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 
16:01:54.736379 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/463e7811-9c30-4106-942b-19deb65f748c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736442 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736458 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bad40e77-ebaa-48c9-a463-b2e821fbe30f-audit-dir\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736477 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/314f92cb-af76-454e-b67c-f056477de5e9-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736592 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmfgs\" (UniqueName: \"kubernetes.io/projected/a18279da-6931-4175-af9d-5fca5a160c91-kube-api-access-xmfgs\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736612 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-etcd-serving-ca\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-serving-cert\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736699 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736720 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f07d0628-187a-492f-8dee-f1e28ba448cb-serving-cert\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736874 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-config\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.736991 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737011 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-encryption-config\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737040 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-audit-policies\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737158 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k24p\" (UniqueName: \"kubernetes.io/projected/1a0fb443-0ee4-4505-b781-b14c49d069bf-kube-api-access-8k24p\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737177 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737255 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cql28\" (UniqueName: \"kubernetes.io/projected/bad40e77-ebaa-48c9-a463-b2e821fbe30f-kube-api-access-cql28\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737279 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737297 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-client-ca\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737417 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-audit-policies\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737435 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-etcd-ca\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737452 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62d263d2-2a73-401e-9922-59d5939d6b24-audit-dir\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737531 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737548 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a18279da-6931-4175-af9d-5fca5a160c91-etcd-client\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737565 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314f92cb-af76-454e-b67c-f056477de5e9-config\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.737871 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.753491 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.742157 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.756092 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.757283 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-config\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.757966 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.758131 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.758145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bad40e77-ebaa-48c9-a463-b2e821fbe30f-audit-dir\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.758289 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.758469 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.758623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-audit\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.759811 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.760818 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-image-import-ca\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.761164 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.762502 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bad40e77-ebaa-48c9-a463-b2e821fbe30f-etcd-serving-ca\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.762652 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-encryption-config\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.762679 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bad40e77-ebaa-48c9-a463-b2e821fbe30f-node-pullsecrets\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.765176 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-serving-cert\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.765771 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2lkkk"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.766416 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.766592 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.766592 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.764820 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bad40e77-ebaa-48c9-a463-b2e821fbe30f-etcd-client\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.766814 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.767166 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lv25s"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.766705 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.767562 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xvs22"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.767877 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.768248 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551200-x7tx8"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.768415 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.769407 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.769601 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.769849 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.770177 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.771568 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.771766 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.771847 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.772005 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.772326 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.772597 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.772633 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.772934 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.772965 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.773668 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jnsfp"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.775151 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lg8th"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.777637 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mbssj"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.779482 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rd78d"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.781772 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.781999 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.782022 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2nnxk"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.782559 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.782859 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4k4zw"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.783709 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.783922 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wgqf6"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.787500 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c7t6"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.789426 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.791502 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.794372 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.801945 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.806471 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ntw62"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.808220 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.809829 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.812522 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5gntv"] Mar 09 16:01:54 crc kubenswrapper[4831]: 
I0309 16:01:54.814424 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.817579 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.817612 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fcd87"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.819628 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.819685 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.820721 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2lkkk"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.821779 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fgls5"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.831706 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ln52t"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.838314 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.838350 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 
16:01:54.841486 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/463e7811-9c30-4106-942b-19deb65f748c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841523 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841549 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841573 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/314f92cb-af76-454e-b67c-f056477de5e9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841590 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-serving-cert\") pod \"apiserver-7bbb656c7d-db24t\" (UID: 
\"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841622 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmfgs\" (UniqueName: \"kubernetes.io/projected/a18279da-6931-4175-af9d-5fca5a160c91-kube-api-access-xmfgs\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f07d0628-187a-492f-8dee-f1e28ba448cb-serving-cert\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841675 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-config\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841696 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-audit-policies\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841712 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k24p\" (UniqueName: \"kubernetes.io/projected/1a0fb443-0ee4-4505-b781-b14c49d069bf-kube-api-access-8k24p\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841726 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-client-ca\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841759 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-audit-policies\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841789 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-etcd-ca\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841803 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62d263d2-2a73-401e-9922-59d5939d6b24-audit-dir\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 
16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a18279da-6931-4175-af9d-5fca5a160c91-etcd-client\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314f92cb-af76-454e-b67c-f056477de5e9-config\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841861 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-encryption-config\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-config\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841891 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-etcd-service-ca\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlc9m\" (UniqueName: \"kubernetes.io/projected/f9b292d9-3fc4-40cb-a74b-00c849999f8c-kube-api-access-vlc9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841927 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0fb443-0ee4-4505-b781-b14c49d069bf-config\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-config\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841955 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/314f92cb-af76-454e-b67c-f056477de5e9-images\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtfc\" (UniqueName: \"kubernetes.io/projected/f07d0628-187a-492f-8dee-f1e28ba448cb-kube-api-access-tgtfc\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.841999 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b292d9-3fc4-40cb-a74b-00c849999f8c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842015 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b292d9-3fc4-40cb-a74b-00c849999f8c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842079 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-client-ca\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842095 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8xrt\" (UniqueName: \"kubernetes.io/projected/314f92cb-af76-454e-b67c-f056477de5e9-kube-api-access-n8xrt\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-etcd-client\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842124 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/463e7811-9c30-4106-942b-19deb65f748c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842137 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-trusted-ca-bundle\") 
pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q48j\" (UniqueName: \"kubernetes.io/projected/62d263d2-2a73-401e-9922-59d5939d6b24-kube-api-access-5q48j\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842168 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktdhv\" (UniqueName: \"kubernetes.io/projected/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-kube-api-access-ktdhv\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842182 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842198 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842213 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pr8bw\" (UniqueName: \"kubernetes.io/projected/62d1a4be-a162-466f-b579-247a86379faa-kube-api-access-pr8bw\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842228 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870c85dc-0d14-4681-be3c-9e87bee849d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-serving-cert\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842256 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a0fb443-0ee4-4505-b781-b14c49d069bf-trusted-ca\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842287 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a18279da-6931-4175-af9d-5fca5a160c91-serving-cert\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj7jq\" (UniqueName: \"kubernetes.io/projected/870c85dc-0d14-4681-be3c-9e87bee849d8-kube-api-access-qj7jq\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842318 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842333 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/870c85dc-0d14-4681-be3c-9e87bee849d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842348 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870c85dc-0d14-4681-be3c-9e87bee849d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842363 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0fb443-0ee4-4505-b781-b14c49d069bf-serving-cert\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842379 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62d1a4be-a162-466f-b579-247a86379faa-audit-dir\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.842408 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463e7811-9c30-4106-942b-19deb65f748c-config\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.843057 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463e7811-9c30-4106-942b-19deb65f748c-config\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 
16:01:54.850445 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.850677 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/314f92cb-af76-454e-b67c-f056477de5e9-images\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.850733 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-client-ca\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.850735 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.851547 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-config\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.851620 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.851853 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-audit-policies\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.852950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-etcd-ca\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.852981 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a18279da-6931-4175-af9d-5fca5a160c91-etcd-service-ca\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.853586 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314f92cb-af76-454e-b67c-f056477de5e9-config\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.853637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/62d263d2-2a73-401e-9922-59d5939d6b24-audit-dir\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.853712 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b292d9-3fc4-40cb-a74b-00c849999f8c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.854112 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-config\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.854162 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.854193 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f07d0628-187a-492f-8dee-f1e28ba448cb-serving-cert\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.854269 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b292d9-3fc4-40cb-a74b-00c849999f8c-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.854345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.854771 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.855876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-client-ca\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.856533 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0fb443-0ee4-4505-b781-b14c49d069bf-config\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.856654 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-audit-policies\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.857194 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62d263d2-2a73-401e-9922-59d5939d6b24-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.857296 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62d1a4be-a162-466f-b579-247a86379faa-audit-dir\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.857857 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-etcd-client\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.858230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.858472 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.858718 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.859125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-config\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.859263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a0fb443-0ee4-4505-b781-b14c49d069bf-trusted-ca\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.859334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.859354 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-encryption-config\") pod \"apiserver-7bbb656c7d-db24t\" (UID: 
\"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.859761 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870c85dc-0d14-4681-be3c-9e87bee849d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.860358 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.864017 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0fb443-0ee4-4505-b781-b14c49d069bf-serving-cert\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.864045 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-serving-cert\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.864113 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a18279da-6931-4175-af9d-5fca5a160c91-etcd-client\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.864447 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.864494 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a18279da-6931-4175-af9d-5fca5a160c91-serving-cert\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.864726 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/463e7811-9c30-4106-942b-19deb65f748c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.864755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.865039 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/870c85dc-0d14-4681-be3c-9e87bee849d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: 
\"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.865218 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.865534 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.867070 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d263d2-2a73-401e-9922-59d5939d6b24-serving-cert\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.867980 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/314f92cb-af76-454e-b67c-f056477de5e9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.873877 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.876584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.878631 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.879656 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.881381 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551200-x7tx8"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.882666 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.883930 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2259m"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.884741 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.885393 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.886589 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v7slb"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.889015 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lv25s"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.889123 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.889330 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.890769 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xvs22"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.892074 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.894007 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z4ljk"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.895052 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2259m"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.895218 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.895813 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.900551 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v7slb"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.900689 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z4ljk"] Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.900502 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.915705 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.935834 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.955931 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.975461 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 16:01:54 crc kubenswrapper[4831]: I0309 16:01:54.995606 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.016216 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 16:01:55 crc kubenswrapper[4831]: 
I0309 16:01:55.037290 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.056277 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.076852 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.096007 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.116875 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.137177 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.157379 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.177776 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.196870 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.216715 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.237674 4831 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.257047 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.276199 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.296705 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.316148 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.335858 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.356720 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.396976 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.406042 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cql28\" (UniqueName: \"kubernetes.io/projected/bad40e77-ebaa-48c9-a463-b2e821fbe30f-kube-api-access-cql28\") pod \"apiserver-76f77b778f-rd78d\" (UID: \"bad40e77-ebaa-48c9-a463-b2e821fbe30f\") " pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.420330 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.436126 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.458070 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.477282 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.496520 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.517499 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.520615 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.540537 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.557110 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.577103 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.597238 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.615840 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.616928 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.636982 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.656946 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.676736 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.697385 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.717662 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.736548 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.756219 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.775104 4831 request.go:700] Waited for 1.008625152s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.776963 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.797238 4831 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.816145 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.836943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.855592 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.876488 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.896724 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.917733 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.935884 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.956570 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.975785 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.993375 4831 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rd78d"] Mar 09 16:01:55 crc kubenswrapper[4831]: I0309 16:01:55.997467 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.016425 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.037135 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.056433 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.077553 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.097674 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.117158 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.136824 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.156298 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.176635 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.196289 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.216738 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.243493 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.256053 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.276611 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.296837 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.316911 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.337101 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.357448 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.378070 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: 
I0309 16:01:56.396599 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.416170 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.437703 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.457968 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.476507 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.496127 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.516426 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.536616 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.555609 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.576791 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.657561 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-tgtfc\" (UniqueName: \"kubernetes.io/projected/f07d0628-187a-492f-8dee-f1e28ba448cb-kube-api-access-tgtfc\") pod \"route-controller-manager-6576b87f9c-k4n9g\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.663726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.663871 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.663951 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-trusted-ca-bundle\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664010 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-oauth-serving-cert\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " 
pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664072 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664139 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-service-ca\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664184 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/642242a2-404d-4008-aacf-ebb38010d636-console-oauth-config\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664229 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-console-config\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664295 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-bound-sa-token\") pod 
\"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664327 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9dgq\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-kube-api-access-d9dgq\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664363 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7r2x\" (UniqueName: \"kubernetes.io/projected/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-kube-api-access-r7r2x\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664451 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f464ea06-1df1-45a9-9a75-d513ac1de15e-config\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664638 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664717 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f464ea06-1df1-45a9-9a75-d513ac1de15e-auth-proxy-config\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664764 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hfh5\" (UniqueName: \"kubernetes.io/projected/642242a2-404d-4008-aacf-ebb38010d636-kube-api-access-8hfh5\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664861 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-trusted-ca\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664911 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/642242a2-404d-4008-aacf-ebb38010d636-console-serving-cert\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.664972 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f464ea06-1df1-45a9-9a75-d513ac1de15e-machine-approver-tls\") pod \"machine-approver-56656f9798-2m8bl\" (UID: 
\"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.665017 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-tls\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.665099 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-certificates\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.665140 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv4qv\" (UniqueName: \"kubernetes.io/projected/f464ea06-1df1-45a9-9a75-d513ac1de15e-kube-api-access-pv4qv\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.665173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.665504 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: E0309 16:01:56.665937 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.165913268 +0000 UTC m=+244.299595781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.679024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k24p\" (UniqueName: \"kubernetes.io/projected/1a0fb443-0ee4-4505-b781-b14c49d069bf-kube-api-access-8k24p\") pod \"console-operator-58897d9998-jnsfp\" (UID: \"1a0fb443-0ee4-4505-b781-b14c49d069bf\") " pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.691066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmfgs\" (UniqueName: \"kubernetes.io/projected/a18279da-6931-4175-af9d-5fca5a160c91-kube-api-access-xmfgs\") pod \"etcd-operator-b45778765-fgls5\" (UID: \"a18279da-6931-4175-af9d-5fca5a160c91\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:56 
crc kubenswrapper[4831]: I0309 16:01:56.700811 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.716719 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/463e7811-9c30-4106-942b-19deb65f748c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2jn8\" (UID: \"463e7811-9c30-4106-942b-19deb65f748c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.726489 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.734145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q48j\" (UniqueName: \"kubernetes.io/projected/62d263d2-2a73-401e-9922-59d5939d6b24-kube-api-access-5q48j\") pod \"apiserver-7bbb656c7d-db24t\" (UID: \"62d263d2-2a73-401e-9922-59d5939d6b24\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.734548 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.763611 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktdhv\" (UniqueName: \"kubernetes.io/projected/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-kube-api-access-ktdhv\") pod \"controller-manager-879f6c89f-7c7t6\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.768750 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.768982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f464ea06-1df1-45a9-9a75-d513ac1de15e-machine-approver-tls\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-config\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfxhd\" 
(UniqueName: \"kubernetes.io/projected/09e969ab-296e-4da0-8347-66c9227a149a-kube-api-access-lfxhd\") pod \"cluster-samples-operator-665b6dd947-vctbp\" (UID: \"09e969ab-296e-4da0-8347-66c9227a149a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769096 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28f45\" (UniqueName: \"kubernetes.io/projected/e5882f28-881f-4b29-91b1-d2170207949c-kube-api-access-28f45\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769125 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqfjh\" (UniqueName: \"kubernetes.io/projected/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-kube-api-access-wqfjh\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-csi-data-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769206 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-tls\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 
16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769238 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-certificates\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv4qv\" (UniqueName: \"kubernetes.io/projected/f464ea06-1df1-45a9-9a75-d513ac1de15e-kube-api-access-pv4qv\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769307 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769435 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-plugins-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98b2ff36-f824-46b5-ad2a-d01710453cb2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: 
\"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769523 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlpsm\" (UniqueName: \"kubernetes.io/projected/98b2ff36-f824-46b5-ad2a-d01710453cb2-kube-api-access-xlpsm\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: \"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769593 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c796eb3-c14d-48be-882d-5ae13e12918a-config-volume\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769627 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-registration-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769677 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6723899f-ecd5-4a9b-abd9-c824d873ae92-serving-cert\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769710 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfaaf56f-0a5d-434c-8bc7-f764c883e204-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769766 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769833 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-mountpoint-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769861 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5630f338-fb68-4efc-8eb0-4b2c2fcae913-images\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769903 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-trusted-ca-bundle\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " 
pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769934 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4km\" (UniqueName: \"kubernetes.io/projected/406f198b-ce58-49b3-a79d-1793c51985fc-kube-api-access-zl4km\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.769971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5xck\" (UniqueName: \"kubernetes.io/projected/09973ee5-21ce-4c4f-b422-dea474d63482-kube-api-access-k5xck\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770025 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/406f198b-ce58-49b3-a79d-1793c51985fc-profile-collector-cert\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5882f28-881f-4b29-91b1-d2170207949c-node-bootstrap-token\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afa7f33f-fe13-438d-bd39-594f917e6015-webhook-cert\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770172 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770214 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a864a900-0b20-49ae-a846-736a9784eee1-proxy-tls\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-console-config\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770294 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0af5bc7b-a373-45fc-b972-17c5a31f317e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770327 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09e969ab-296e-4da0-8347-66c9227a149a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vctbp\" (UID: \"09e969ab-296e-4da0-8347-66c9227a149a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770360 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7ac9e931-216f-42bf-9364-f39b8cbe2b60-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770392 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dg7\" (UniqueName: \"kubernetes.io/projected/abca9390-cbde-4734-910a-433bc590e42a-kube-api-access-27dg7\") pod \"dns-operator-744455d44c-5gntv\" (UID: \"abca9390-cbde-4734-910a-433bc590e42a\") " pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770446 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770498 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-bound-sa-token\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9dgq\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-kube-api-access-d9dgq\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770565 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ght6f\" (UniqueName: \"kubernetes.io/projected/6723899f-ecd5-4a9b-abd9-c824d873ae92-kube-api-access-ght6f\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770597 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c5d72ddf-8850-405d-aa67-ae759dad90be-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2lkkk\" (UID: \"c5d72ddf-8850-405d-aa67-ae759dad90be\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75ce18b8-5e44-47af-801c-97f9963d1786-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-h96p9\" (UID: \"75ce18b8-5e44-47af-801c-97f9963d1786\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770686 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxlk\" (UniqueName: \"kubernetes.io/projected/3c796eb3-c14d-48be-882d-5ae13e12918a-kube-api-access-wrxlk\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770719 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwbpg\" (UniqueName: \"kubernetes.io/projected/a650250c-ac76-4486-a46d-cd53b83afe05-kube-api-access-cwbpg\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770752 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f464ea06-1df1-45a9-9a75-d513ac1de15e-auth-proxy-config\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770784 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfaaf56f-0a5d-434c-8bc7-f764c883e204-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770871 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnpd\" (UniqueName: \"kubernetes.io/projected/5630f338-fb68-4efc-8eb0-4b2c2fcae913-kube-api-access-plnpd\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.770909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5630f338-fb68-4efc-8eb0-4b2c2fcae913-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771005 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771038 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/406f198b-ce58-49b3-a79d-1793c51985fc-srv-cert\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771143 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-trusted-ca\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771177 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/247926dc-e09f-4518-874a-d349acc3c7cd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5f6d8\" (UID: \"247926dc-e09f-4518-874a-d349acc3c7cd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d94f81-ac48-4829-b0c0-54339568c8f0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/642242a2-404d-4008-aacf-ebb38010d636-console-serving-cert\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771393 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5tx\" (UniqueName: \"kubernetes.io/projected/3fb50200-700a-4355-8e9b-65b2cb24dc02-kube-api-access-qs5tx\") pod \"ingress-canary-2259m\" (UID: \"3fb50200-700a-4355-8e9b-65b2cb24dc02\") " 
pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771441 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-certificates\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: E0309 16:01:56.771658 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.271637605 +0000 UTC m=+244.405320038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771985 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.776189 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f464ea06-1df1-45a9-9a75-d513ac1de15e-auth-proxy-config\") pod 
\"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.776848 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-console-config\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.777415 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f464ea06-1df1-45a9-9a75-d513ac1de15e-machine-approver-tls\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.778658 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.779517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.781064 4831 request.go:700] Waited for 1.92406668s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.771447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5630f338-fb68-4efc-8eb0-4b2c2fcae913-proxy-tls\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.783715 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-trusted-ca-bundle\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.783743 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98b2ff36-f824-46b5-ad2a-d01710453cb2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: 
\"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.783850 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfaaf56f-0a5d-434c-8bc7-f764c883e204-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784139 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462165ba-5266-4604-a449-6dbd4faf67b3-config\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784228 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-trusted-ca\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784366 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/642242a2-404d-4008-aacf-ebb38010d636-console-serving-cert\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a650250c-ac76-4486-a46d-cd53b83afe05-config-volume\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784499 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-stats-auth\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784599 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7d8d\" (UniqueName: \"kubernetes.io/projected/b62fcd76-8790-4a1e-898d-b9654a876ddc-kube-api-access-x7d8d\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784721 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6b8c\" (UniqueName: \"kubernetes.io/projected/0af5bc7b-a373-45fc-b972-17c5a31f317e-kube-api-access-m6b8c\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784810 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppc86\" (UniqueName: \"kubernetes.io/projected/f9d94f81-ac48-4829-b0c0-54339568c8f0-kube-api-access-ppc86\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/462165ba-5266-4604-a449-6dbd4faf67b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-metrics-certs\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.784995 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.785048 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fb50200-700a-4355-8e9b-65b2cb24dc02-cert\") pod \"ingress-canary-2259m\" (UID: \"3fb50200-700a-4355-8e9b-65b2cb24dc02\") " pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.785175 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/b62fcd76-8790-4a1e-898d-b9654a876ddc-signing-key\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.785220 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkb4\" (UniqueName: \"kubernetes.io/projected/247926dc-e09f-4518-874a-d349acc3c7cd-kube-api-access-2kkb4\") pod \"package-server-manager-789f6589d5-5f6d8\" (UID: \"247926dc-e09f-4518-874a-d349acc3c7cd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.785268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c796eb3-c14d-48be-882d-5ae13e12918a-secret-volume\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.785314 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/462165ba-5266-4604-a449-6dbd4faf67b3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.787657 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/abca9390-cbde-4734-910a-433bc590e42a-metrics-tls\") pod \"dns-operator-744455d44c-5gntv\" (UID: \"abca9390-cbde-4734-910a-433bc590e42a\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.792418 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tl2n\" (UniqueName: \"kubernetes.io/projected/75ce18b8-5e44-47af-801c-97f9963d1786-kube-api-access-8tl2n\") pod \"control-plane-machine-set-operator-78cbb6b69f-h96p9\" (UID: \"75ce18b8-5e44-47af-801c-97f9963d1786\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.793064 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9d94f81-ac48-4829-b0c0-54339568c8f0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.793199 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.793372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqqt\" (UniqueName: \"kubernetes.io/projected/a864a900-0b20-49ae-a846-736a9784eee1-kube-api-access-9rqqt\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.793559 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6723899f-ecd5-4a9b-abd9-c824d873ae92-config\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.793811 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-serving-cert\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.794129 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrlfm\" (UniqueName: \"kubernetes.io/projected/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-kube-api-access-zrlfm\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.794467 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcr8\" (UniqueName: \"kubernetes.io/projected/24253089-f34c-4d7a-816f-49c18af92c20-kube-api-access-mxcr8\") pod \"downloads-7954f5f757-ntw62\" (UID: \"24253089-f34c-4d7a-816f-49c18af92c20\") " pod="openshift-console/downloads-7954f5f757-ntw62" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.794609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-oauth-serving-cert\") pod 
\"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.795372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqks8\" (UniqueName: \"kubernetes.io/projected/a013f059-7440-4f06-88f6-a73f3286d228-kube-api-access-dqks8\") pod \"auto-csr-approver-29551200-x7tx8\" (UID: \"a013f059-7440-4f06-88f6-a73f3286d228\") " pod="openshift-infra/auto-csr-approver-29551200-x7tx8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.795472 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac9e931-216f-42bf-9364-f39b8cbe2b60-serving-cert\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.795578 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfrtg\" (UniqueName: \"kubernetes.io/projected/7ac9e931-216f-42bf-9364-f39b8cbe2b60-kube-api-access-pfrtg\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.795985 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.796062 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-tls\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.796302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.796434 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a864a900-0b20-49ae-a846-736a9784eee1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.796575 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwwm\" (UniqueName: \"kubernetes.io/projected/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-kube-api-access-fcwwm\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.797303 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b62fcd76-8790-4a1e-898d-b9654a876ddc-signing-cabundle\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.797421 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/afa7f33f-fe13-438d-bd39-594f917e6015-tmpfs\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.797467 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-service-ca\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.797948 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/642242a2-404d-4008-aacf-ebb38010d636-console-oauth-config\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798047 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5882f28-881f-4b29-91b1-d2170207949c-certs\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798087 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09973ee5-21ce-4c4f-b422-dea474d63482-service-ca-bundle\") pod \"router-default-5444994796-q7h8z\" (UID: 
\"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798775 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjs2\" (UniqueName: \"kubernetes.io/projected/5032087a-e1c3-47cb-9b31-2983fe8aff99-kube-api-access-dbjs2\") pod \"migrator-59844c95c7-4vggg\" (UID: \"5032087a-e1c3-47cb-9b31-2983fe8aff99\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7r2x\" (UniqueName: \"kubernetes.io/projected/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-kube-api-access-r7r2x\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a650250c-ac76-4486-a46d-cd53b83afe05-metrics-tls\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798919 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afa7f33f-fe13-438d-bd39-594f917e6015-apiservice-cert\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798957 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f464ea06-1df1-45a9-9a75-d513ac1de15e-config\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.798990 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-service-ca-bundle\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.799031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0af5bc7b-a373-45fc-b972-17c5a31f317e-srv-cert\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.799119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.799153 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfh5\" (UniqueName: \"kubernetes.io/projected/642242a2-404d-4008-aacf-ebb38010d636-kube-api-access-8hfh5\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc 
kubenswrapper[4831]: I0309 16:01:56.799182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg2nm\" (UniqueName: \"kubernetes.io/projected/c5d72ddf-8850-405d-aa67-ae759dad90be-kube-api-access-jg2nm\") pod \"multus-admission-controller-857f4d67dd-2lkkk\" (UID: \"c5d72ddf-8850-405d-aa67-ae759dad90be\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.799213 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-socket-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.799262 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-default-certificate\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.799317 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtlp\" (UniqueName: \"kubernetes.io/projected/afa7f33f-fe13-438d-bd39-594f917e6015-kube-api-access-qrtlp\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.800031 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f464ea06-1df1-45a9-9a75-d513ac1de15e-config\") pod \"machine-approver-56656f9798-2m8bl\" 
(UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.800382 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj7jq\" (UniqueName: \"kubernetes.io/projected/870c85dc-0d14-4681-be3c-9e87bee849d8-kube-api-access-qj7jq\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.800555 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlc9m\" (UniqueName: \"kubernetes.io/projected/f9b292d9-3fc4-40cb-a74b-00c849999f8c-kube-api-access-vlc9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-rmhsf\" (UID: \"f9b292d9-3fc4-40cb-a74b-00c849999f8c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:56 crc kubenswrapper[4831]: E0309 16:01:56.800746 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.300720445 +0000 UTC m=+244.434402878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.802732 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.803451 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-oauth-serving-cert\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.803891 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/642242a2-404d-4008-aacf-ebb38010d636-service-ca\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.805930 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/642242a2-404d-4008-aacf-ebb38010d636-console-oauth-config\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " 
pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.828731 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8xrt\" (UniqueName: \"kubernetes.io/projected/314f92cb-af76-454e-b67c-f056477de5e9-kube-api-access-n8xrt\") pod \"machine-api-operator-5694c8668f-lg8th\" (UID: \"314f92cb-af76-454e-b67c-f056477de5e9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.830744 4831 generic.go:334] "Generic (PLEG): container finished" podID="bad40e77-ebaa-48c9-a463-b2e821fbe30f" containerID="5edcf00ee44cb91a84492cc8c15a56a0f02644d45e4edda77f187eded38ea5a9" exitCode=0 Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.830785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" event={"ID":"bad40e77-ebaa-48c9-a463-b2e821fbe30f","Type":"ContainerDied","Data":"5edcf00ee44cb91a84492cc8c15a56a0f02644d45e4edda77f187eded38ea5a9"} Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.830815 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" event={"ID":"bad40e77-ebaa-48c9-a463-b2e821fbe30f","Type":"ContainerStarted","Data":"85af19b79c8400ee1abf9cc50e13358c4009d393143b39e400b954024fb893c0"} Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.836992 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870c85dc-0d14-4681-be3c-9e87bee849d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4j68p\" (UID: \"870c85dc-0d14-4681-be3c-9e87bee849d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.855111 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr8bw\" (UniqueName: 
\"kubernetes.io/projected/62d1a4be-a162-466f-b579-247a86379faa-kube-api-access-pr8bw\") pod \"oauth-openshift-558db77b4-2nnxk\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.857011 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.878750 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.897928 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.898460 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g"] Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902537 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:56 crc kubenswrapper[4831]: E0309 16:01:56.902658 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.40263772 +0000 UTC m=+244.536320143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902827 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902863 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a864a900-0b20-49ae-a846-736a9784eee1-proxy-tls\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0af5bc7b-a373-45fc-b972-17c5a31f317e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/09e969ab-296e-4da0-8347-66c9227a149a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vctbp\" (UID: \"09e969ab-296e-4da0-8347-66c9227a149a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7ac9e931-216f-42bf-9364-f39b8cbe2b60-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902962 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27dg7\" (UniqueName: \"kubernetes.io/projected/abca9390-cbde-4734-910a-433bc590e42a-kube-api-access-27dg7\") pod \"dns-operator-744455d44c-5gntv\" (UID: \"abca9390-cbde-4734-910a-433bc590e42a\") " pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.902983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.903012 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ght6f\" (UniqueName: \"kubernetes.io/projected/6723899f-ecd5-4a9b-abd9-c824d873ae92-kube-api-access-ght6f\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.903050 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxlk\" (UniqueName: \"kubernetes.io/projected/3c796eb3-c14d-48be-882d-5ae13e12918a-kube-api-access-wrxlk\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.903071 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwbpg\" (UniqueName: \"kubernetes.io/projected/a650250c-ac76-4486-a46d-cd53b83afe05-kube-api-access-cwbpg\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.903091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c5d72ddf-8850-405d-aa67-ae759dad90be-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2lkkk\" (UID: \"c5d72ddf-8850-405d-aa67-ae759dad90be\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.903339 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75ce18b8-5e44-47af-801c-97f9963d1786-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h96p9\" (UID: \"75ce18b8-5e44-47af-801c-97f9963d1786\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.903369 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bfaaf56f-0a5d-434c-8bc7-f764c883e204-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904137 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnpd\" (UniqueName: \"kubernetes.io/projected/5630f338-fb68-4efc-8eb0-4b2c2fcae913-kube-api-access-plnpd\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5630f338-fb68-4efc-8eb0-4b2c2fcae913-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904192 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/406f198b-ce58-49b3-a79d-1793c51985fc-srv-cert\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904268 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/247926dc-e09f-4518-874a-d349acc3c7cd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5f6d8\" (UID: \"247926dc-e09f-4518-874a-d349acc3c7cd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904294 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d94f81-ac48-4829-b0c0-54339568c8f0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904318 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5630f338-fb68-4efc-8eb0-4b2c2fcae913-proxy-tls\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904345 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5tx\" (UniqueName: \"kubernetes.io/projected/3fb50200-700a-4355-8e9b-65b2cb24dc02-kube-api-access-qs5tx\") pod \"ingress-canary-2259m\" (UID: \"3fb50200-700a-4355-8e9b-65b2cb24dc02\") " pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/98b2ff36-f824-46b5-ad2a-d01710453cb2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: \"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904412 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfaaf56f-0a5d-434c-8bc7-f764c883e204-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904437 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a650250c-ac76-4486-a46d-cd53b83afe05-config-volume\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-stats-auth\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462165ba-5266-4604-a449-6dbd4faf67b3-config\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904520 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7d8d\" (UniqueName: \"kubernetes.io/projected/b62fcd76-8790-4a1e-898d-b9654a876ddc-kube-api-access-x7d8d\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904544 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/462165ba-5266-4604-a449-6dbd4faf67b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904567 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-metrics-certs\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6b8c\" (UniqueName: \"kubernetes.io/projected/0af5bc7b-a373-45fc-b972-17c5a31f317e-kube-api-access-m6b8c\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904617 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppc86\" (UniqueName: \"kubernetes.io/projected/f9d94f81-ac48-4829-b0c0-54339568c8f0-kube-api-access-ppc86\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: 
\"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fb50200-700a-4355-8e9b-65b2cb24dc02-cert\") pod \"ingress-canary-2259m\" (UID: \"3fb50200-700a-4355-8e9b-65b2cb24dc02\") " pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904666 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b62fcd76-8790-4a1e-898d-b9654a876ddc-signing-key\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904695 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkb4\" (UniqueName: \"kubernetes.io/projected/247926dc-e09f-4518-874a-d349acc3c7cd-kube-api-access-2kkb4\") pod \"package-server-manager-789f6589d5-5f6d8\" (UID: \"247926dc-e09f-4518-874a-d349acc3c7cd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904719 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c796eb3-c14d-48be-882d-5ae13e12918a-secret-volume\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904743 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/462165ba-5266-4604-a449-6dbd4faf67b3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/abca9390-cbde-4734-910a-433bc590e42a-metrics-tls\") pod \"dns-operator-744455d44c-5gntv\" (UID: \"abca9390-cbde-4734-910a-433bc590e42a\") " pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.904985 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9d94f81-ac48-4829-b0c0-54339568c8f0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905013 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tl2n\" (UniqueName: \"kubernetes.io/projected/75ce18b8-5e44-47af-801c-97f9963d1786-kube-api-access-8tl2n\") pod \"control-plane-machine-set-operator-78cbb6b69f-h96p9\" (UID: \"75ce18b8-5e44-47af-801c-97f9963d1786\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905050 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6723899f-ecd5-4a9b-abd9-c824d873ae92-config\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905075 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqqt\" (UniqueName: \"kubernetes.io/projected/a864a900-0b20-49ae-a846-736a9784eee1-kube-api-access-9rqqt\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-serving-cert\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905108 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7ac9e931-216f-42bf-9364-f39b8cbe2b60-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905142 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrlfm\" (UniqueName: \"kubernetes.io/projected/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-kube-api-access-zrlfm\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mxcr8\" (UniqueName: \"kubernetes.io/projected/24253089-f34c-4d7a-816f-49c18af92c20-kube-api-access-mxcr8\") pod \"downloads-7954f5f757-ntw62\" (UID: \"24253089-f34c-4d7a-816f-49c18af92c20\") " pod="openshift-console/downloads-7954f5f757-ntw62" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905190 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac9e931-216f-42bf-9364-f39b8cbe2b60-serving-cert\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqks8\" (UniqueName: \"kubernetes.io/projected/a013f059-7440-4f06-88f6-a73f3286d228-kube-api-access-dqks8\") pod \"auto-csr-approver-29551200-x7tx8\" (UID: \"a013f059-7440-4f06-88f6-a73f3286d228\") " pod="openshift-infra/auto-csr-approver-29551200-x7tx8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfrtg\" (UniqueName: \"kubernetes.io/projected/7ac9e931-216f-42bf-9364-f39b8cbe2b60-kube-api-access-pfrtg\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905268 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a864a900-0b20-49ae-a846-736a9784eee1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b62fcd76-8790-4a1e-898d-b9654a876ddc-signing-cabundle\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905330 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/afa7f33f-fe13-438d-bd39-594f917e6015-tmpfs\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905341 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d94f81-ac48-4829-b0c0-54339568c8f0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905355 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwwm\" (UniqueName: \"kubernetes.io/projected/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-kube-api-access-fcwwm\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5882f28-881f-4b29-91b1-d2170207949c-certs\") pod 
\"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjs2\" (UniqueName: \"kubernetes.io/projected/5032087a-e1c3-47cb-9b31-2983fe8aff99-kube-api-access-dbjs2\") pod \"migrator-59844c95c7-4vggg\" (UID: \"5032087a-e1c3-47cb-9b31-2983fe8aff99\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09973ee5-21ce-4c4f-b422-dea474d63482-service-ca-bundle\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905527 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-service-ca-bundle\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0af5bc7b-a373-45fc-b972-17c5a31f317e-srv-cert\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905574 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/a650250c-ac76-4486-a46d-cd53b83afe05-metrics-tls\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905597 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afa7f33f-fe13-438d-bd39-594f917e6015-apiservice-cert\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905628 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg2nm\" (UniqueName: \"kubernetes.io/projected/c5d72ddf-8850-405d-aa67-ae759dad90be-kube-api-access-jg2nm\") pod \"multus-admission-controller-857f4d67dd-2lkkk\" (UID: \"c5d72ddf-8850-405d-aa67-ae759dad90be\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905665 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905707 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-socket-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905738 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-default-certificate\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905768 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtlp\" (UniqueName: \"kubernetes.io/projected/afa7f33f-fe13-438d-bd39-594f917e6015-kube-api-access-qrtlp\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905793 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-config\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905818 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfxhd\" (UniqueName: \"kubernetes.io/projected/09e969ab-296e-4da0-8347-66c9227a149a-kube-api-access-lfxhd\") pod \"cluster-samples-operator-665b6dd947-vctbp\" (UID: \"09e969ab-296e-4da0-8347-66c9227a149a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28f45\" (UniqueName: \"kubernetes.io/projected/e5882f28-881f-4b29-91b1-d2170207949c-kube-api-access-28f45\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " 
pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905874 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqfjh\" (UniqueName: \"kubernetes.io/projected/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-kube-api-access-wqfjh\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905896 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-csi-data-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905947 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-plugins-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905971 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98b2ff36-f824-46b5-ad2a-d01710453cb2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: \"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.905993 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlpsm\" (UniqueName: \"kubernetes.io/projected/98b2ff36-f824-46b5-ad2a-d01710453cb2-kube-api-access-xlpsm\") pod 
\"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: \"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906020 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c796eb3-c14d-48be-882d-5ae13e12918a-config-volume\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906054 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-registration-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906088 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6723899f-ecd5-4a9b-abd9-c824d873ae92-serving-cert\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfaaf56f-0a5d-434c-8bc7-f764c883e204-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906140 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-mountpoint-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906201 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4km\" (UniqueName: \"kubernetes.io/projected/406f198b-ce58-49b3-a79d-1793c51985fc-kube-api-access-zl4km\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906221 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5630f338-fb68-4efc-8eb0-4b2c2fcae913-images\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/406f198b-ce58-49b3-a79d-1793c51985fc-profile-collector-cert\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5882f28-881f-4b29-91b1-d2170207949c-node-bootstrap-token\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 
16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906280 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5xck\" (UniqueName: \"kubernetes.io/projected/09973ee5-21ce-4c4f-b422-dea474d63482-kube-api-access-k5xck\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.906303 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afa7f33f-fe13-438d-bd39-594f917e6015-webhook-cert\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.907079 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a864a900-0b20-49ae-a846-736a9784eee1-proxy-tls\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.907679 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462165ba-5266-4604-a449-6dbd4faf67b3-config\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.907876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98b2ff36-f824-46b5-ad2a-d01710453cb2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: 
\"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.910325 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-serving-cert\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.910565 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6723899f-ecd5-4a9b-abd9-c824d873ae92-config\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: E0309 16:01:56.910679 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.410666352 +0000 UTC m=+244.544348775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.912174 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.912514 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9d94f81-ac48-4829-b0c0-54339568c8f0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.912608 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5630f338-fb68-4efc-8eb0-4b2c2fcae913-proxy-tls\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.913127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.913232 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09e969ab-296e-4da0-8347-66c9227a149a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vctbp\" (UID: \"09e969ab-296e-4da0-8347-66c9227a149a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.913590 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-service-ca-bundle\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.913664 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5630f338-fb68-4efc-8eb0-4b2c2fcae913-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.918810 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfaaf56f-0a5d-434c-8bc7-f764c883e204-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 
16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.918902 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.919383 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfaaf56f-0a5d-434c-8bc7-f764c883e204-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.919662 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a864a900-0b20-49ae-a846-736a9784eee1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.919794 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-metrics-certs\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.919903 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/462165ba-5266-4604-a449-6dbd4faf67b3-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.920458 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-socket-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.920624 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/406f198b-ce58-49b3-a79d-1793c51985fc-srv-cert\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.920655 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.921791 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09973ee5-21ce-4c4f-b422-dea474d63482-service-ca-bundle\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.921896 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/afa7f33f-fe13-438d-bd39-594f917e6015-tmpfs\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 
16:01:56.922094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-csi-data-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.922453 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/247926dc-e09f-4518-874a-d349acc3c7cd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5f6d8\" (UID: \"247926dc-e09f-4518-874a-d349acc3c7cd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.922503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-stats-auth\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.922798 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5882f28-881f-4b29-91b1-d2170207949c-certs\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.923140 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/abca9390-cbde-4734-910a-433bc590e42a-metrics-tls\") pod \"dns-operator-744455d44c-5gntv\" (UID: \"abca9390-cbde-4734-910a-433bc590e42a\") " pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:56 crc kubenswrapper[4831]: 
I0309 16:01:56.923178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-plugins-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.923310 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.923650 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-config\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.923828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-mountpoint-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.924960 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c796eb3-c14d-48be-882d-5ae13e12918a-config-volume\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.925046 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-registration-dir\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.925617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b62fcd76-8790-4a1e-898d-b9654a876ddc-signing-cabundle\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.926194 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5630f338-fb68-4efc-8eb0-4b2c2fcae913-images\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.926727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afa7f33f-fe13-438d-bd39-594f917e6015-apiservice-cert\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.926868 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09973ee5-21ce-4c4f-b422-dea474d63482-default-certificate\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.927120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/afa7f33f-fe13-438d-bd39-594f917e6015-webhook-cert\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.927524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c5d72ddf-8850-405d-aa67-ae759dad90be-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2lkkk\" (UID: \"c5d72ddf-8850-405d-aa67-ae759dad90be\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.927609 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fb50200-700a-4355-8e9b-65b2cb24dc02-cert\") pod \"ingress-canary-2259m\" (UID: \"3fb50200-700a-4355-8e9b-65b2cb24dc02\") " pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.933777 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0af5bc7b-a373-45fc-b972-17c5a31f317e-srv-cert\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.933798 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6723899f-ecd5-4a9b-abd9-c824d873ae92-serving-cert\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.933879 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7ac9e931-216f-42bf-9364-f39b8cbe2b60-serving-cert\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.934092 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/406f198b-ce58-49b3-a79d-1793c51985fc-profile-collector-cert\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.934415 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75ce18b8-5e44-47af-801c-97f9963d1786-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h96p9\" (UID: \"75ce18b8-5e44-47af-801c-97f9963d1786\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.935837 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0af5bc7b-a373-45fc-b972-17c5a31f317e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.936274 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.937022 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/b62fcd76-8790-4a1e-898d-b9654a876ddc-signing-key\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.938391 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5882f28-881f-4b29-91b1-d2170207949c-node-bootstrap-token\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.938837 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98b2ff36-f824-46b5-ad2a-d01710453cb2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: \"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.940500 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c796eb3-c14d-48be-882d-5ae13e12918a-secret-volume\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.956441 4831 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.957274 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fgls5"] Mar 09 16:01:56 crc kubenswrapper[4831]: W0309 16:01:56.970389 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda18279da_6931_4175_af9d_5fca5a160c91.slice/crio-aee3c1282991aef96a0c2a71eecb051e080b4571e81cdeaa3762efc1305c20a3 WatchSource:0}: Error finding container aee3c1282991aef96a0c2a71eecb051e080b4571e81cdeaa3762efc1305c20a3: Status 404 returned error can't find the container with id aee3c1282991aef96a0c2a71eecb051e080b4571e81cdeaa3762efc1305c20a3 Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.977231 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 16:01:56 crc kubenswrapper[4831]: I0309 16:01:56.997578 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.007510 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.009226 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.509195309 +0000 UTC m=+244.642877802 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.009797 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.014788 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.020283 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.020692 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.021916 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8"] Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.032859 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a650250c-ac76-4486-a46d-cd53b83afe05-metrics-tls\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.035620 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.038746 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.042929 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.044199 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a650250c-ac76-4486-a46d-cd53b83afe05-config-volume\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.059374 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.059936 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c7t6"] Mar 09 16:01:57 crc kubenswrapper[4831]: W0309 16:01:57.061757 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463e7811_9c30_4106_942b_19deb65f748c.slice/crio-5c1cede33a78486bd611cf4764cfd2902f2b46bdb45358171bc67c0fd9c551a4 WatchSource:0}: Error finding container 5c1cede33a78486bd611cf4764cfd2902f2b46bdb45358171bc67c0fd9c551a4: Status 404 returned error can't find the container with id 5c1cede33a78486bd611cf4764cfd2902f2b46bdb45358171bc67c0fd9c551a4 Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.076960 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 
16:01:57.108285 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jnsfp"] Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.110070 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.111183 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.611152166 +0000 UTC m=+244.744834579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.134985 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv4qv\" (UniqueName: \"kubernetes.io/projected/f464ea06-1df1-45a9-9a75-d513ac1de15e-kube-api-access-pv4qv\") pod \"machine-approver-56656f9798-2m8bl\" (UID: \"f464ea06-1df1-45a9-9a75-d513ac1de15e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:57 crc kubenswrapper[4831]: W0309 16:01:57.135232 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a0fb443_0ee4_4505_b781_b14c49d069bf.slice/crio-a3c8511fed4794f8253c641566a33c4f04369478e0244fb7122390ae273a4db1 WatchSource:0}: Error finding container a3c8511fed4794f8253c641566a33c4f04369478e0244fb7122390ae273a4db1: Status 404 returned error can't find the container with id a3c8511fed4794f8253c641566a33c4f04369478e0244fb7122390ae273a4db1 Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.158449 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-bound-sa-token\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.176694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9dgq\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-kube-api-access-d9dgq\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.199927 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.212998 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.213607 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.713582966 +0000 UTC m=+244.847265379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.215279 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfh5\" (UniqueName: \"kubernetes.io/projected/642242a2-404d-4008-aacf-ebb38010d636-kube-api-access-8hfh5\") pod \"console-f9d7485db-wgqf6\" (UID: \"642242a2-404d-4008-aacf-ebb38010d636\") " pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.234972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7r2x\" (UniqueName: \"kubernetes.io/projected/a241b4f4-f7eb-4aac-8a23-cd8deeed49c0-kube-api-access-r7r2x\") pod \"ingress-operator-5b745b69d9-mlrcf\" (UID: \"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.248623 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2nnxk"] Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.254832 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-27dg7\" (UniqueName: \"kubernetes.io/projected/abca9390-cbde-4734-910a-433bc590e42a-kube-api-access-27dg7\") pod \"dns-operator-744455d44c-5gntv\" (UID: \"abca9390-cbde-4734-910a-433bc590e42a\") " pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.281924 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppc86\" (UniqueName: \"kubernetes.io/projected/f9d94f81-ac48-4829-b0c0-54339568c8f0-kube-api-access-ppc86\") pod \"kube-storage-version-migrator-operator-b67b599dd-npdkz\" (UID: \"f9d94f81-ac48-4829-b0c0-54339568c8f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.289772 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.293184 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.295207 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwwm\" (UniqueName: \"kubernetes.io/projected/1a3ad7b1-fa93-4b88-bd28-8aaea19d6763-kube-api-access-fcwwm\") pod \"csi-hostpathplugin-v7slb\" (UID: \"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763\") " pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.307202 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.315099 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.315804 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.815789499 +0000 UTC m=+244.949471922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.316678 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/462165ba-5266-4604-a449-6dbd4faf67b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vqjxn\" (UID: \"462165ba-5266-4604-a449-6dbd4faf67b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.341731 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p"] 
Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.347658 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7d8d\" (UniqueName: \"kubernetes.io/projected/b62fcd76-8790-4a1e-898d-b9654a876ddc-kube-api-access-x7d8d\") pod \"service-ca-9c57cc56f-fcd87\" (UID: \"b62fcd76-8790-4a1e-898d-b9654a876ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.350958 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5tx\" (UniqueName: \"kubernetes.io/projected/3fb50200-700a-4355-8e9b-65b2cb24dc02-kube-api-access-qs5tx\") pod \"ingress-canary-2259m\" (UID: \"3fb50200-700a-4355-8e9b-65b2cb24dc02\") " pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:57 crc kubenswrapper[4831]: W0309 16:01:57.394509 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870c85dc_0d14_4681_be3c_9e87bee849d8.slice/crio-835ee5815a765827b3a39c84300d7afe05c4d5597b138bedd104ed5b1b26096a WatchSource:0}: Error finding container 835ee5815a765827b3a39c84300d7afe05c4d5597b138bedd104ed5b1b26096a: Status 404 returned error can't find the container with id 835ee5815a765827b3a39c84300d7afe05c4d5597b138bedd104ed5b1b26096a Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.398694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqqt\" (UniqueName: \"kubernetes.io/projected/a864a900-0b20-49ae-a846-736a9784eee1-kube-api-access-9rqqt\") pod \"machine-config-controller-84d6567774-wk2hz\" (UID: \"a864a900-0b20-49ae-a846-736a9784eee1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.400382 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tl2n\" (UniqueName: 
\"kubernetes.io/projected/75ce18b8-5e44-47af-801c-97f9963d1786-kube-api-access-8tl2n\") pod \"control-plane-machine-set-operator-78cbb6b69f-h96p9\" (UID: \"75ce18b8-5e44-47af-801c-97f9963d1786\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.404638 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.415815 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.415948 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.416144 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.916120964 +0000 UTC m=+245.049803387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.417302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.417619 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:57.917606683 +0000 UTC m=+245.051289106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.419442 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkb4\" (UniqueName: \"kubernetes.io/projected/247926dc-e09f-4518-874a-d349acc3c7cd-kube-api-access-2kkb4\") pod \"package-server-manager-789f6589d5-5f6d8\" (UID: \"247926dc-e09f-4518-874a-d349acc3c7cd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.425974 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf"] Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.437797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ght6f\" (UniqueName: \"kubernetes.io/projected/6723899f-ecd5-4a9b-abd9-c824d873ae92-kube-api-access-ght6f\") pod \"service-ca-operator-777779d784-xvs22\" (UID: \"6723899f-ecd5-4a9b-abd9-c824d873ae92\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.457700 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxlk\" (UniqueName: \"kubernetes.io/projected/3c796eb3-c14d-48be-882d-5ae13e12918a-kube-api-access-wrxlk\") pod \"collect-profiles-29551200-5bml6\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:57 crc 
kubenswrapper[4831]: I0309 16:01:57.463607 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t"] Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.472208 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.482723 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwbpg\" (UniqueName: \"kubernetes.io/projected/a650250c-ac76-4486-a46d-cd53b83afe05-kube-api-access-cwbpg\") pod \"dns-default-z4ljk\" (UID: \"a650250c-ac76-4486-a46d-cd53b83afe05\") " pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.492219 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.514126 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnpd\" (UniqueName: \"kubernetes.io/projected/5630f338-fb68-4efc-8eb0-4b2c2fcae913-kube-api-access-plnpd\") pod \"machine-config-operator-74547568cd-ps5sb\" (UID: \"5630f338-fb68-4efc-8eb0-4b2c2fcae913\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.518613 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.519786 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.019766926 +0000 UTC m=+245.153449349 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.522993 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.528883 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6b8c\" (UniqueName: \"kubernetes.io/projected/0af5bc7b-a373-45fc-b972-17c5a31f317e-kube-api-access-m6b8c\") pod \"olm-operator-6b444d44fb-ntzjm\" (UID: \"0af5bc7b-a373-45fc-b972-17c5a31f317e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.543919 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lg8th"] Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.546154 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.554567 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.556184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqks8\" (UniqueName: \"kubernetes.io/projected/a013f059-7440-4f06-88f6-a73f3286d228-kube-api-access-dqks8\") pod \"auto-csr-approver-29551200-x7tx8\" (UID: \"a013f059-7440-4f06-88f6-a73f3286d228\") " pod="openshift-infra/auto-csr-approver-29551200-x7tx8" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.558623 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.559433 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrlfm\" (UniqueName: \"kubernetes.io/projected/afa31e98-60fb-4eb4-8090-e9daf55a5c6c-kube-api-access-zrlfm\") pod \"authentication-operator-69f744f599-ln52t\" (UID: \"afa31e98-60fb-4eb4-8090-e9daf55a5c6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.573411 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.592311 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.598107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjs2\" (UniqueName: \"kubernetes.io/projected/5032087a-e1c3-47cb-9b31-2983fe8aff99-kube-api-access-dbjs2\") pod \"migrator-59844c95c7-4vggg\" (UID: \"5032087a-e1c3-47cb-9b31-2983fe8aff99\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.599585 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2259m" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.619467 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcr8\" (UniqueName: \"kubernetes.io/projected/24253089-f34c-4d7a-816f-49c18af92c20-kube-api-access-mxcr8\") pod \"downloads-7954f5f757-ntw62\" (UID: \"24253089-f34c-4d7a-816f-49c18af92c20\") " pod="openshift-console/downloads-7954f5f757-ntw62" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.619862 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z4ljk" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.624940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.625443 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 16:01:58.12541524 +0000 UTC m=+245.259097733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.631702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfrtg\" (UniqueName: \"kubernetes.io/projected/7ac9e931-216f-42bf-9364-f39b8cbe2b60-kube-api-access-pfrtg\") pod \"openshift-config-operator-7777fb866f-tbp9n\" (UID: \"7ac9e931-216f-42bf-9364-f39b8cbe2b60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.643030 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg2nm\" (UniqueName: \"kubernetes.io/projected/c5d72ddf-8850-405d-aa67-ae759dad90be-kube-api-access-jg2nm\") pod \"multus-admission-controller-857f4d67dd-2lkkk\" (UID: \"c5d72ddf-8850-405d-aa67-ae759dad90be\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.648258 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.658370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlpsm\" (UniqueName: \"kubernetes.io/projected/98b2ff36-f824-46b5-ad2a-d01710453cb2-kube-api-access-xlpsm\") pod \"openshift-apiserver-operator-796bbdcf4f-nw95b\" (UID: \"98b2ff36-f824-46b5-ad2a-d01710453cb2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.658611 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.667012 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.670443 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5gntv"] Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.673077 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.682972 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ntw62" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.684123 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28f45\" (UniqueName: \"kubernetes.io/projected/e5882f28-881f-4b29-91b1-d2170207949c-kube-api-access-28f45\") pod \"machine-config-server-4k4zw\" (UID: \"e5882f28-881f-4b29-91b1-d2170207949c\") " pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.690933 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.696106 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtlp\" (UniqueName: \"kubernetes.io/projected/afa7f33f-fe13-438d-bd39-594f917e6015-kube-api-access-qrtlp\") pod \"packageserver-d55dfcdfc-x9895\" (UID: \"afa7f33f-fe13-438d-bd39-594f917e6015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.712170 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqfjh\" (UniqueName: \"kubernetes.io/projected/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-kube-api-access-wqfjh\") pod \"marketplace-operator-79b997595-lv25s\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.726881 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.727065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.727230 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.227212463 +0000 UTC m=+245.360894886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.727409 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.727711 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.227703756 +0000 UTC m=+245.361386179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.732104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfxhd\" (UniqueName: \"kubernetes.io/projected/09e969ab-296e-4da0-8347-66c9227a149a-kube-api-access-lfxhd\") pod \"cluster-samples-operator-665b6dd947-vctbp\" (UID: \"09e969ab-296e-4da0-8347-66c9227a149a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.738657 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.755776 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfaaf56f-0a5d-434c-8bc7-f764c883e204-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qwsf\" (UID: \"bfaaf56f-0a5d-434c-8bc7-f764c883e204\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:57 crc kubenswrapper[4831]: W0309 16:01:57.774068 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod314f92cb_af76_454e_b67c_f056477de5e9.slice/crio-975d54a3797888f2c685907b66cfeb63881d0a839a54b0f3f55000642e776416 WatchSource:0}: Error finding container 975d54a3797888f2c685907b66cfeb63881d0a839a54b0f3f55000642e776416: Status 404 returned error can't find the container with id 975d54a3797888f2c685907b66cfeb63881d0a839a54b0f3f55000642e776416 Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.774596 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4km\" (UniqueName: \"kubernetes.io/projected/406f198b-ce58-49b3-a79d-1793c51985fc-kube-api-access-zl4km\") pod \"catalog-operator-68c6474976-nmvzm\" (UID: \"406f198b-ce58-49b3-a79d-1793c51985fc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.783445 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.802441 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.813096 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.815644 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5xck\" (UniqueName: \"kubernetes.io/projected/09973ee5-21ce-4c4f-b422-dea474d63482-kube-api-access-k5xck\") pod \"router-default-5444994796-q7h8z\" (UID: \"09973ee5-21ce-4c4f-b422-dea474d63482\") " pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.834880 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.835334 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.335314613 +0000 UTC m=+245.468997036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.836119 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.839625 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4k4zw" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.854762 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" event={"ID":"f07d0628-187a-492f-8dee-f1e28ba448cb","Type":"ContainerStarted","Data":"53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.854802 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" event={"ID":"f07d0628-187a-492f-8dee-f1e28ba448cb","Type":"ContainerStarted","Data":"f4af8cdfadf7295f16f6f24b8fdd1c2cb15c425504cef1d89350ddaa74e6ebd8"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.855664 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.856822 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" 
event={"ID":"314f92cb-af76-454e-b67c-f056477de5e9","Type":"ContainerStarted","Data":"975d54a3797888f2c685907b66cfeb63881d0a839a54b0f3f55000642e776416"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.857621 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" event={"ID":"f464ea06-1df1-45a9-9a75-d513ac1de15e","Type":"ContainerStarted","Data":"2f494513e7fce2e781706d33a54da075fe75a936d26447687da1f3fb51acb3f7"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.858892 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" event={"ID":"62d263d2-2a73-401e-9922-59d5939d6b24","Type":"ContainerStarted","Data":"7223fcc64e054d227dc6de10ef1a0510f69987802eece4c67529a33bd7e7e2dc"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.860171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" event={"ID":"f9b292d9-3fc4-40cb-a74b-00c849999f8c","Type":"ContainerStarted","Data":"c7822baeeee23d8bf43ebddfd124c7cfc6d86aeb3fea299bc6a3f2e5213381f8"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.860221 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" event={"ID":"f9b292d9-3fc4-40cb-a74b-00c849999f8c","Type":"ContainerStarted","Data":"4e5b6768df199c0253e8c91dd1f1d51b84bc3f43115800cd68ad3ed5e72c64ca"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.861328 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" event={"ID":"1a0fb443-0ee4-4505-b781-b14c49d069bf","Type":"ContainerStarted","Data":"9d9b85f1ac89e10931eafad62dd966b09c596a963ea16bd930432c903f31d765"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.861379 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" event={"ID":"1a0fb443-0ee4-4505-b781-b14c49d069bf","Type":"ContainerStarted","Data":"a3c8511fed4794f8253c641566a33c4f04369478e0244fb7122390ae273a4db1"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.862636 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.863773 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" event={"ID":"850f7c83-ddcb-4209-84d7-27c63c8b3e1c","Type":"ContainerStarted","Data":"2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.863791 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" event={"ID":"850f7c83-ddcb-4209-84d7-27c63c8b3e1c","Type":"ContainerStarted","Data":"6f9f16bba3afefc953799c7df2bc2cef29f1775c25dfb1874981337f8f2ca6b0"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.864488 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.869915 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.890442 4831 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7c7t6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.890517 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" podUID="850f7c83-ddcb-4209-84d7-27c63c8b3e1c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.890856 4831 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-k4n9g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.890917 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" podUID="f07d0628-187a-492f-8dee-f1e28ba448cb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.892453 4831 patch_prober.go:28] interesting pod/console-operator-58897d9998-jnsfp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" 
start-of-body= Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.892486 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" podUID="1a0fb443-0ee4-4505-b781-b14c49d069bf" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.940216 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" event={"ID":"463e7811-9c30-4106-942b-19deb65f748c","Type":"ContainerStarted","Data":"036159e4ce7e8fb1ac6e3964f9a878e406c963e49bb132ebc000eebe164abc6e"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.940275 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" event={"ID":"463e7811-9c30-4106-942b-19deb65f748c","Type":"ContainerStarted","Data":"5c1cede33a78486bd611cf4764cfd2902f2b46bdb45358171bc67c0fd9c551a4"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.941071 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:57 crc kubenswrapper[4831]: E0309 16:01:57.941363 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.441346908 +0000 UTC m=+245.575029391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.973619 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" event={"ID":"abca9390-cbde-4734-910a-433bc590e42a","Type":"ContainerStarted","Data":"a5d01edc3810ad94187d10771d3043395d063187cfaa05ccd9d2e483c4e2ec6f"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.990558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" event={"ID":"870c85dc-0d14-4681-be3c-9e87bee849d8","Type":"ContainerStarted","Data":"1532dff4ad49c48f6553338290802d9540ba3d3a06cc87dc8830bd72825104e8"} Mar 09 16:01:57 crc kubenswrapper[4831]: I0309 16:01:57.990596 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" event={"ID":"870c85dc-0d14-4681-be3c-9e87bee849d8","Type":"ContainerStarted","Data":"835ee5815a765827b3a39c84300d7afe05c4d5597b138bedd104ed5b1b26096a"} Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.001888 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.004556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" event={"ID":"a18279da-6931-4175-af9d-5fca5a160c91","Type":"ContainerStarted","Data":"0780addb2051200cdcb10fffd3d89a25877bdbc1d8595eecd0419f4f3e94e942"} Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.004602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" event={"ID":"a18279da-6931-4175-af9d-5fca5a160c91","Type":"ContainerStarted","Data":"aee3c1282991aef96a0c2a71eecb051e080b4571e81cdeaa3762efc1305c20a3"} Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.022953 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" event={"ID":"bad40e77-ebaa-48c9-a463-b2e821fbe30f","Type":"ContainerStarted","Data":"f261441a83c038fcf6d58f09b4ae3dd06c8f7512b106085e9b58bdab2d4b7cfa"} Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.023020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" event={"ID":"bad40e77-ebaa-48c9-a463-b2e821fbe30f","Type":"ContainerStarted","Data":"20d420df9142982d79c5ee45ad0b89508b6001e47a08e917cc5916a73da365f7"} Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.024117 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf"] Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.028720 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" event={"ID":"62d1a4be-a162-466f-b579-247a86379faa","Type":"ContainerStarted","Data":"0c4ff45242da52b6df31ac2222e1c7d28ea1abcf97edf96d37c8284648f611cf"} Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.028777 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" event={"ID":"62d1a4be-a162-466f-b579-247a86379faa","Type":"ContainerStarted","Data":"e0ab32144cb9d58a423422329e692e814c5dca299b4c7a1fc34065289c41d75f"} Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.029590 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.034158 4831 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2nnxk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.034271 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" podUID="62d1a4be-a162-466f-b579-247a86379faa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.042955 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.045906 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.545877083 +0000 UTC m=+245.679559676 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.051426 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.069066 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8"] Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.145333 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.159495 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.659472208 +0000 UTC m=+245.793154631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.172196 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v7slb"] Mar 09 16:01:58 crc kubenswrapper[4831]: W0309 16:01:58.221484 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda241b4f4_f7eb_4aac_8a23_cd8deeed49c0.slice/crio-7adf65b47d2d7b22c3204a5d127f090569c02202fb8994e8941fa7a238382ffc WatchSource:0}: Error finding container 7adf65b47d2d7b22c3204a5d127f090569c02202fb8994e8941fa7a238382ffc: Status 404 returned error can't find the container with id 7adf65b47d2d7b22c3204a5d127f090569c02202fb8994e8941fa7a238382ffc Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.259184 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.259993 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.759568736 +0000 UTC m=+245.893251159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.363360 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.366670 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.866647589 +0000 UTC m=+246.000330012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.472554 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.472704 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.972671554 +0000 UTC m=+246.106353977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.472989 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.473314 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:58.973298241 +0000 UTC m=+246.106980664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.571997 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" podStartSLOduration=176.571968441 podStartE2EDuration="2m56.571968441s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:58.570646906 +0000 UTC m=+245.704329329" watchObservedRunningTime="2026-03-09 16:01:58.571968441 +0000 UTC m=+245.705650864" Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.574341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.574835 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.074819816 +0000 UTC m=+246.208502229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.679522 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.680438 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.18042082 +0000 UTC m=+246.314103243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.781701 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.782278 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.282216923 +0000 UTC m=+246.415899346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.834464 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" podStartSLOduration=177.834434994 podStartE2EDuration="2m57.834434994s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:58.817502296 +0000 UTC m=+245.951184719" watchObservedRunningTime="2026-03-09 16:01:58.834434994 +0000 UTC m=+245.968117427" Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.883242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.883630 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.383617555 +0000 UTC m=+246.517299978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.953554 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2jn8" podStartSLOduration=177.953522014 podStartE2EDuration="2m57.953522014s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:58.951935372 +0000 UTC m=+246.085617795" watchObservedRunningTime="2026-03-09 16:01:58.953522014 +0000 UTC m=+246.087204437" Mar 09 16:01:58 crc kubenswrapper[4831]: I0309 16:01:58.986482 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:58 crc kubenswrapper[4831]: E0309 16:01:58.986829 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.486815325 +0000 UTC m=+246.620497748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.105116 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.105550 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.605533486 +0000 UTC m=+246.739215899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.184292 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4k4zw" event={"ID":"e5882f28-881f-4b29-91b1-d2170207949c","Type":"ContainerStarted","Data":"0466df6ef114592e3e19e3d48b9c542532a88c97b243c5aa6828c7bab88c5607"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.202325 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" event={"ID":"247926dc-e09f-4518-874a-d349acc3c7cd","Type":"ContainerStarted","Data":"596bfaf2f2ce13d32a18365681725ce1c0a47a60555168dc37a1cbdbeff47ee1"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.206420 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.206755 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.706739423 +0000 UTC m=+246.840421846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.221673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" event={"ID":"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0","Type":"ContainerStarted","Data":"7adf65b47d2d7b22c3204a5d127f090569c02202fb8994e8941fa7a238382ffc"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.229652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" event={"ID":"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763","Type":"ContainerStarted","Data":"3feb1d1640ede4249095ce080cb43572919be5985603e095298e64c078a1f16c"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.240133 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q7h8z" event={"ID":"09973ee5-21ce-4c4f-b422-dea474d63482","Type":"ContainerStarted","Data":"17ea6f38d82d0cbecf44890f648a0fd73d3165a3d1887300c0d2de81c21067d4"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.244004 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" event={"ID":"314f92cb-af76-454e-b67c-f056477de5e9","Type":"ContainerStarted","Data":"1f2b08bb1566b8ffe4cda5b8ab972a04540a25762ed7076a5a8457ae8a53f590"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.249295 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" 
event={"ID":"f464ea06-1df1-45a9-9a75-d513ac1de15e","Type":"ContainerStarted","Data":"58b7d31ac9085fb25cfd7ee4cc64811433c20b64b5b1284efc3e9847e69a1380"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.251130 4831 generic.go:334] "Generic (PLEG): container finished" podID="62d263d2-2a73-401e-9922-59d5939d6b24" containerID="d5d440c0c7e1aa315f7603f0767dd5444822c3649baf7fe220ac7b9b0dc7810e" exitCode=0 Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.255361 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" event={"ID":"62d263d2-2a73-401e-9922-59d5939d6b24","Type":"ContainerDied","Data":"d5d440c0c7e1aa315f7603f0767dd5444822c3649baf7fe220ac7b9b0dc7810e"} Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.310587 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.316506 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.816491096 +0000 UTC m=+246.950173519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.370067 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" podStartSLOduration=178.370047473 podStartE2EDuration="2m58.370047473s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:59.368004129 +0000 UTC m=+246.501686552" watchObservedRunningTime="2026-03-09 16:01:59.370047473 +0000 UTC m=+246.503729896" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.412718 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.412972 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:01:59.912956798 +0000 UTC m=+247.046639221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.414918 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" podStartSLOduration=178.41489891 podStartE2EDuration="2m58.41489891s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:59.412824705 +0000 UTC m=+246.546507118" watchObservedRunningTime="2026-03-09 16:01:59.41489891 +0000 UTC m=+246.548581333" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.514589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.514969 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.014955357 +0000 UTC m=+247.148637780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.533953 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rmhsf" podStartSLOduration=178.533932829 podStartE2EDuration="2m58.533932829s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:59.533692622 +0000 UTC m=+246.667375045" watchObservedRunningTime="2026-03-09 16:01:59.533932829 +0000 UTC m=+246.667615252" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.613658 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fgls5" podStartSLOduration=178.613641827 podStartE2EDuration="2m58.613641827s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:59.589273713 +0000 UTC m=+246.722956136" watchObservedRunningTime="2026-03-09 16:01:59.613641827 +0000 UTC m=+246.747324250" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.615965 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.616313 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.116302818 +0000 UTC m=+247.249985241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.676322 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.689569 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" podStartSLOduration=178.689550855 podStartE2EDuration="2m58.689550855s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:59.687270185 +0000 UTC m=+246.820952608" watchObservedRunningTime="2026-03-09 16:01:59.689550855 +0000 UTC m=+246.823233278" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.717828 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.718204 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.218187093 +0000 UTC m=+247.351869516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.798475 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4j68p" podStartSLOduration=178.798456796 podStartE2EDuration="2m58.798456796s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:01:59.797967733 +0000 UTC m=+246.931650156" watchObservedRunningTime="2026-03-09 16:01:59.798456796 +0000 UTC m=+246.932139219" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.819061 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:01:59 crc kubenswrapper[4831]: E0309 16:01:59.819580 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.319538824 +0000 UTC m=+247.453221247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.847736 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.871281 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm"] Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.907699 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2259m"] Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.928547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:01:59 crc 
kubenswrapper[4831]: E0309 16:01:59.928934 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.428923098 +0000 UTC m=+247.562605521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.929529 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xvs22"] Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.955988 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn"] Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.988791 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:01:59 crc kubenswrapper[4831]: I0309 16:01:59.995738 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fcd87"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.029188 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wgqf6"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.032501 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.032845 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.532830057 +0000 UTC m=+247.666512480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: W0309 16:02:00.038871 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6723899f_ecd5_4a9b_abd9_c824d873ae92.slice/crio-c083aabd052c49438bdf7fe666d1d410cbb303a047617a38438a423577109bb8 WatchSource:0}: Error finding container c083aabd052c49438bdf7fe666d1d410cbb303a047617a38438a423577109bb8: Status 404 returned error can't find the container with id c083aabd052c49438bdf7fe666d1d410cbb303a047617a38438a423577109bb8 Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.075764 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.092749 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp"] Mar 09 16:02:00 crc 
kubenswrapper[4831]: I0309 16:02:00.116721 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz"] Mar 09 16:02:00 crc kubenswrapper[4831]: W0309 16:02:00.122309 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod642242a2_404d_4008_aacf_ebb38010d636.slice/crio-9f02438302ebfd656adfdd7c201e1e56cb2e95a642813893f14bd8f6ecedb21d WatchSource:0}: Error finding container 9f02438302ebfd656adfdd7c201e1e56cb2e95a642813893f14bd8f6ecedb21d: Status 404 returned error can't find the container with id 9f02438302ebfd656adfdd7c201e1e56cb2e95a642813893f14bd8f6ecedb21d Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.132448 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.134688 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.134959 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.634948188 +0000 UTC m=+247.768630611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.154765 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ntw62"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.158786 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.176226 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jnsfp" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.179883 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2lkkk"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.192096 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.210510 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551200-x7tx8"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.216569 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.236528 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-ln52t"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.240969 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.241258 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.74124244 +0000 UTC m=+247.874924863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.269265 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551202-rgtkz"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.270204 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.288221 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551202-rgtkz"] Mar 09 16:02:00 crc kubenswrapper[4831]: W0309 16:02:00.312873 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda013f059_7440_4f06_88f6_a73f3286d228.slice/crio-b5e2f88c5e35c01ac6509b71ad7d8302a039d9d7b1aa0d7f31d0f97792cecab9 WatchSource:0}: Error finding container b5e2f88c5e35c01ac6509b71ad7d8302a039d9d7b1aa0d7f31d0f97792cecab9: Status 404 returned error can't find the container with id b5e2f88c5e35c01ac6509b71ad7d8302a039d9d7b1aa0d7f31d0f97792cecab9 Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.313006 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.361239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8fjm\" (UniqueName: \"kubernetes.io/projected/33678e26-b1b2-419f-93ef-85ba9e935155-kube-api-access-g8fjm\") pod \"auto-csr-approver-29551202-rgtkz\" (UID: \"33678e26-b1b2-419f-93ef-85ba9e935155\") " pod="openshift-infra/auto-csr-approver-29551202-rgtkz" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.361345 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.361684 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.861672306 +0000 UTC m=+247.995354729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.364274 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.374024 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" event={"ID":"314f92cb-af76-454e-b67c-f056477de5e9","Type":"ContainerStarted","Data":"474f1f0a3bfa0ddb0cd9320302a41fe4cf381afb46c3f85bcd085ce5e13d6900"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.379642 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.380159 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.410288 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.416059 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lv25s"] Mar 09 
16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.434699 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" event={"ID":"3c796eb3-c14d-48be-882d-5ae13e12918a","Type":"ContainerStarted","Data":"b43a6697ed58c8d725a21a09f7041217cdf72beb807914f1fcdde3fae3c8f7f2"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.440462 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z4ljk"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.461937 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.462336 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.462934 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.962904203 +0000 UTC m=+248.096586636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.475940 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895"] Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.481028 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" event={"ID":"0af5bc7b-a373-45fc-b972-17c5a31f317e","Type":"ContainerStarted","Data":"c8c455bd47bd425fde88e6d3b972dfa87c6cbc9e4b855b0b60280e9cd6b3ba16"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.483783 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8fjm\" (UniqueName: \"kubernetes.io/projected/33678e26-b1b2-419f-93ef-85ba9e935155-kube-api-access-g8fjm\") pod \"auto-csr-approver-29551202-rgtkz\" (UID: \"33678e26-b1b2-419f-93ef-85ba9e935155\") " pod="openshift-infra/auto-csr-approver-29551202-rgtkz" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.483935 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.485001 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:00.984989377 +0000 UTC m=+248.118671800 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.515837 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lg8th" podStartSLOduration=179.515822483 podStartE2EDuration="2m59.515822483s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.514836597 +0000 UTC m=+247.648519020" watchObservedRunningTime="2026-03-09 16:02:00.515822483 +0000 UTC m=+247.649504906" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.525121 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.525191 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.528542 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" 
event={"ID":"247926dc-e09f-4518-874a-d349acc3c7cd","Type":"ContainerStarted","Data":"011e364e8e8ba013ff986059d8cfe39e9ec1233818f633e0620023f3356ff0f9"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.528588 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" event={"ID":"247926dc-e09f-4518-874a-d349acc3c7cd","Type":"ContainerStarted","Data":"624ab93f24aee415ab49db6df4f17be2286bc1ff5abf1b4d21cfe4894a5b8ddc"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.528626 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.540143 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8fjm\" (UniqueName: \"kubernetes.io/projected/33678e26-b1b2-419f-93ef-85ba9e935155-kube-api-access-g8fjm\") pod \"auto-csr-approver-29551202-rgtkz\" (UID: \"33678e26-b1b2-419f-93ef-85ba9e935155\") " pod="openshift-infra/auto-csr-approver-29551202-rgtkz" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.575630 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" event={"ID":"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0","Type":"ContainerStarted","Data":"9e6fdee72a630326936e7cc6ab2c8b355db484c16dadff955249a03242dbb4da"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.575703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" event={"ID":"a241b4f4-f7eb-4aac-8a23-cd8deeed49c0","Type":"ContainerStarted","Data":"017b83b3d79e3ab93ad1f27e5c1964331830441e7645f39162cf6ad0ac8bd66d"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.576286 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" podStartSLOduration=178.576263592 podStartE2EDuration="2m58.576263592s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.575035869 +0000 UTC m=+247.708718312" watchObservedRunningTime="2026-03-09 16:02:00.576263592 +0000 UTC m=+247.709946015" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.580425 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ntw62" event={"ID":"24253089-f34c-4d7a-816f-49c18af92c20","Type":"ContainerStarted","Data":"e977b8d66092941c4e11301aba041d056d4591e5e088c7e4df6bbc75afb0dbf1"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.581834 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" event={"ID":"a864a900-0b20-49ae-a846-736a9784eee1","Type":"ContainerStarted","Data":"2cbec6b659847ab16bdad898a220e373f3fcf6424dfb348764f7bac02f0cb6da"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.585711 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.586239 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.086216575 +0000 UTC m=+248.219898998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.612795 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" event={"ID":"462165ba-5266-4604-a449-6dbd4faf67b3","Type":"ContainerStarted","Data":"87273502621b125c05a07fd78c0911c2cb75e76bb2100660451c3181b7573192"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.630846 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wgqf6" event={"ID":"642242a2-404d-4008-aacf-ebb38010d636","Type":"ContainerStarted","Data":"9f02438302ebfd656adfdd7c201e1e56cb2e95a642813893f14bd8f6ecedb21d"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.632340 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" event={"ID":"5630f338-fb68-4efc-8eb0-4b2c2fcae913","Type":"ContainerStarted","Data":"a51d29f2c46bde8ae25c67693104f807c2151af42f953a53e9c82bf0154b429d"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.644244 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mlrcf" podStartSLOduration=179.64421995 podStartE2EDuration="2m59.64421995s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.641235141 +0000 UTC m=+247.774917564" 
watchObservedRunningTime="2026-03-09 16:02:00.64421995 +0000 UTC m=+247.777902373" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.646237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" event={"ID":"b62fcd76-8790-4a1e-898d-b9654a876ddc","Type":"ContainerStarted","Data":"6cb215a4b6ae5c9e1a5a6c6d34fa6fa1b8460b141cf18bcc2d55f043438e011b"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.688929 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.689447 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.691043 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.191022168 +0000 UTC m=+248.324704661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.711778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" event={"ID":"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763","Type":"ContainerStarted","Data":"bee534a5fd3d2de369f3bab5bd611cd2c4a4c20de3a8f543209f51912fb525a5"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.720469 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q7h8z" event={"ID":"09973ee5-21ce-4c4f-b422-dea474d63482","Type":"ContainerStarted","Data":"bb173ee340f5f0f12ede29336487e88ac1bb7006af70171f3b7239ceda6e4677"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.749663 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" event={"ID":"6723899f-ecd5-4a9b-abd9-c824d873ae92","Type":"ContainerStarted","Data":"c083aabd052c49438bdf7fe666d1d410cbb303a047617a38438a423577109bb8"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.761965 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-q7h8z" podStartSLOduration=179.761943174 podStartE2EDuration="2m59.761943174s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.760075214 +0000 UTC m=+247.893757637" watchObservedRunningTime="2026-03-09 
16:02:00.761943174 +0000 UTC m=+247.895625597" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.762130 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" event={"ID":"abca9390-cbde-4734-910a-433bc590e42a","Type":"ContainerStarted","Data":"f6ebee3fa0eb875c7ec36226439aa33254d58d4ce1916a9fb47646d097f41e35"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.775996 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" event={"ID":"f464ea06-1df1-45a9-9a75-d513ac1de15e","Type":"ContainerStarted","Data":"53fdb25cc25044cf9ca7949d92511eecd11e31b9676ca2c00f755bc94bb968cf"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.787807 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" event={"ID":"62d263d2-2a73-401e-9922-59d5939d6b24","Type":"ContainerStarted","Data":"cb860e7ea747abb52d46acf97bf04436b0014a32379c33c53e52ee32abb39ae4"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.806507 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.808004 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.307985252 +0000 UTC m=+248.441667675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.814460 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4k4zw" event={"ID":"e5882f28-881f-4b29-91b1-d2170207949c","Type":"ContainerStarted","Data":"fb52efec27a4e876a88f50f6d75904a9e99ec30465e2cfbf47b0460bd827517e"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.838693 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2259m" event={"ID":"3fb50200-700a-4355-8e9b-65b2cb24dc02","Type":"ContainerStarted","Data":"41b9bbc9c37fad930a09fa607a677d8b14d4fd09421d28354422a8b5a46f356f"} Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.852979 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2m8bl" podStartSLOduration=179.852942261 podStartE2EDuration="2m59.852942261s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.850970439 +0000 UTC m=+247.984652862" watchObservedRunningTime="2026-03-09 16:02:00.852942261 +0000 UTC m=+247.986624684" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.853523 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" podStartSLOduration=179.853516076 podStartE2EDuration="2m59.853516076s" 
podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.81472328 +0000 UTC m=+247.948405703" watchObservedRunningTime="2026-03-09 16:02:00.853516076 +0000 UTC m=+247.987198499" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.890020 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4k4zw" podStartSLOduration=6.890000771 podStartE2EDuration="6.890000771s" podCreationTimestamp="2026-03-09 16:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.888836031 +0000 UTC m=+248.022518454" watchObservedRunningTime="2026-03-09 16:02:00.890000771 +0000 UTC m=+248.023683204" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.910332 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:00 crc kubenswrapper[4831]: E0309 16:02:00.910559 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.410530475 +0000 UTC m=+248.544212898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.988009 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2259m" podStartSLOduration=6.987987914 podStartE2EDuration="6.987987914s" podCreationTimestamp="2026-03-09 16:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.986989917 +0000 UTC m=+248.120672330" watchObservedRunningTime="2026-03-09 16:02:00.987987914 +0000 UTC m=+248.121670337" Mar 09 16:02:00 crc kubenswrapper[4831]: I0309 16:02:00.994215 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" podStartSLOduration=178.994190408 podStartE2EDuration="2m58.994190408s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:00.916091302 +0000 UTC m=+248.049773725" watchObservedRunningTime="2026-03-09 16:02:00.994190408 +0000 UTC m=+248.127872831" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.005451 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.012671 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.014155 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.514132205 +0000 UTC m=+248.647814628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.015641 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:01 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:01 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:01 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.016799 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.087576 4831 
patch_prober.go:28] interesting pod/apiserver-76f77b778f-rd78d container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]log ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]etcd ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/max-in-flight-filter ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 09 16:02:01 crc kubenswrapper[4831]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 09 16:02:01 crc kubenswrapper[4831]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/project.openshift.io-projectcache ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-startinformers ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 09 16:02:01 crc kubenswrapper[4831]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 16:02:01 crc kubenswrapper[4831]: livez check failed Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.087638 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" podUID="bad40e77-ebaa-48c9-a463-b2e821fbe30f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.114088 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.114522 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.614510141 +0000 UTC m=+248.748192554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.215277 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.216008 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.715992055 +0000 UTC m=+248.849674478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.325997 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.326299 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.826259922 +0000 UTC m=+248.959942345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.427057 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.427320 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.927297005 +0000 UTC m=+249.060979418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.427758 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.428128 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:01.928114127 +0000 UTC m=+249.061796550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.528220 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.528903 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.028886722 +0000 UTC m=+249.162569145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.633633 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.633917 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.133905761 +0000 UTC m=+249.267588184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.680645 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55088: no serving certificate available for the kubelet" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.743462 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.744245 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.244228239 +0000 UTC m=+249.377910662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.749652 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551202-rgtkz"] Mar 09 16:02:01 crc kubenswrapper[4831]: W0309 16:02:01.837320 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33678e26_b1b2_419f_93ef_85ba9e935155.slice/crio-9b4b5a0e48e13ff3d9a27bfdaa10cf3455cbac202e9160e25300ae07d822ff46 WatchSource:0}: Error finding container 9b4b5a0e48e13ff3d9a27bfdaa10cf3455cbac202e9160e25300ae07d822ff46: Status 404 returned error can't find the container with id 9b4b5a0e48e13ff3d9a27bfdaa10cf3455cbac202e9160e25300ae07d822ff46 Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.848213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.848618 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.34860238 +0000 UTC m=+249.482284803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.879914 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" event={"ID":"afa7f33f-fe13-438d-bd39-594f917e6015","Type":"ContainerStarted","Data":"24a684146ee851c8cc0e7b4e6d0faf09bd7a42870cddd1f398dccc41c50c5017"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.902116 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55102: no serving certificate available for the kubelet" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.918706 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" event={"ID":"3c796eb3-c14d-48be-882d-5ae13e12918a","Type":"ContainerStarted","Data":"249ef961987b962ba9f3fae85bcca9a7590a4d1db9e57ce72496eab9b129f4dc"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.934793 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" event={"ID":"0af5bc7b-a373-45fc-b972-17c5a31f317e","Type":"ContainerStarted","Data":"a450016c2b474e7677a3ec4d364b8bcba730b3db69803551e9ff1a9fbd71f0e8"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.935939 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.938154 4831 patch_prober.go:28] interesting 
pod/olm-operator-6b444d44fb-ntzjm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.938219 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" podUID="0af5bc7b-a373-45fc-b972-17c5a31f317e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.946713 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" event={"ID":"c5d72ddf-8850-405d-aa67-ae759dad90be","Type":"ContainerStarted","Data":"0191b93a1449581cadae1274f0130c056d5bcd2cd62cee7835de04effcb3394c"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.950296 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:01 crc kubenswrapper[4831]: E0309 16:02:01.950679 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.45066479 +0000 UTC m=+249.584347213 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.958247 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" podStartSLOduration=121.95822269 podStartE2EDuration="2m1.95822269s" podCreationTimestamp="2026-03-09 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:01.956815443 +0000 UTC m=+249.090497866" watchObservedRunningTime="2026-03-09 16:02:01.95822269 +0000 UTC m=+249.091905113" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.964680 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wgqf6" event={"ID":"642242a2-404d-4008-aacf-ebb38010d636","Type":"ContainerStarted","Data":"b0766bff2e3498d47f2b2f299061d383c5510ba37127daff15d8e9f6a4afeb2a"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.975675 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" event={"ID":"406f198b-ce58-49b3-a79d-1793c51985fc","Type":"ContainerStarted","Data":"ff4ad956c92b18a2d1d1eeb31cdc1f1950bd2e04b32aaafda3a878cada2fc716"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.975718 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" 
event={"ID":"406f198b-ce58-49b3-a79d-1793c51985fc","Type":"ContainerStarted","Data":"f5800b71b79371ccbf3f33537e272dc47ae9834916f28c1b3db641d850c19994"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.977277 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.978249 4831 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nmvzm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.978328 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" podUID="406f198b-ce58-49b3-a79d-1793c51985fc" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.980453 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ntw62" event={"ID":"24253089-f34c-4d7a-816f-49c18af92c20","Type":"ContainerStarted","Data":"e9196a65d7f0f5289b9d5d2fffcabece038ef5e261b6ad36b82aaa0db536f636"} Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.980508 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ntw62" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.983537 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-ntw62 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 16:02:01 crc 
kubenswrapper[4831]: I0309 16:02:01.983610 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ntw62" podUID="24253089-f34c-4d7a-816f-49c18af92c20" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 16:02:01 crc kubenswrapper[4831]: I0309 16:02:01.998893 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55110: no serving certificate available for the kubelet" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.002921 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" event={"ID":"f742f3f0-b95d-47f7-8554-5c22aa9c60f7","Type":"ContainerStarted","Data":"e9fd6d608eaa51f1a71bebccf735e56456be77dadc489720c1d15d8a0e5f1c31"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.016635 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" event={"ID":"75ce18b8-5e44-47af-801c-97f9963d1786","Type":"ContainerStarted","Data":"64127ae4fbd8da9a194448167777769ae9856542effb1b47349c187e62392630"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.016680 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" event={"ID":"75ce18b8-5e44-47af-801c-97f9963d1786","Type":"ContainerStarted","Data":"ba2e365445be3d4157fb065b6a4a730bc57ebba2be153ab947104ed8b0092712"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.021660 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:02 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:02 crc kubenswrapper[4831]: 
[+]process-running ok Mar 09 16:02:02 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.021703 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.023536 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.023603 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.036001 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ntw62" podStartSLOduration=181.035988127 podStartE2EDuration="3m1.035988127s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.034333774 +0000 UTC m=+249.168016197" watchObservedRunningTime="2026-03-09 16:02:02.035988127 +0000 UTC m=+249.169670550" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.036770 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" podStartSLOduration=180.036763568 podStartE2EDuration="3m0.036763568s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:01.993441652 +0000 UTC m=+249.127124095" watchObservedRunningTime="2026-03-09 16:02:02.036763568 +0000 UTC m=+249.170445991" Mar 09 16:02:02 crc 
kubenswrapper[4831]: I0309 16:02:02.043570 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" event={"ID":"a013f059-7440-4f06-88f6-a73f3286d228","Type":"ContainerStarted","Data":"b5e2f88c5e35c01ac6509b71ad7d8302a039d9d7b1aa0d7f31d0f97792cecab9"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.054946 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.057482 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.557468726 +0000 UTC m=+249.691151149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.089730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" event={"ID":"5630f338-fb68-4efc-8eb0-4b2c2fcae913","Type":"ContainerStarted","Data":"e506c043e2207428a1f86f8efc05deff6a3860dbbc1ad41615736c2eb0ca8b31"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.127140 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wgqf6" podStartSLOduration=181.127095398 podStartE2EDuration="3m1.127095398s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.11433049 +0000 UTC m=+249.248012923" watchObservedRunningTime="2026-03-09 16:02:02.127095398 +0000 UTC m=+249.260777831" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.159443 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.159697 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.659648959 +0000 UTC m=+249.793331382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.159979 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.160498 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.660487591 +0000 UTC m=+249.794170014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.168018 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" event={"ID":"09e969ab-296e-4da0-8347-66c9227a149a","Type":"ContainerStarted","Data":"2b75b3f958aed6732509febcb6389aa684f364cb74f7f3059c2bcc6017b63125"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.168078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" event={"ID":"09e969ab-296e-4da0-8347-66c9227a149a","Type":"ContainerStarted","Data":"ab1e04fd28128b2b2fff27c7e5143ff24ebd23b693c9897c94f4088b62f823c2"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.209893 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5gntv" event={"ID":"abca9390-cbde-4734-910a-433bc590e42a","Type":"ContainerStarted","Data":"3ab4f4c5602586a12c34812d668f508d0043b5b4381d4fdc03c9f4ddadc6bba9"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.211159 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55126: no serving certificate available for the kubelet" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.232886 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" podStartSLOduration=180.232870786 podStartE2EDuration="3m0.232870786s" podCreationTimestamp="2026-03-09 15:59:02 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.230974966 +0000 UTC m=+249.364657389" watchObservedRunningTime="2026-03-09 16:02:02.232870786 +0000 UTC m=+249.366553209" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.244036 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" event={"ID":"6723899f-ecd5-4a9b-abd9-c824d873ae92","Type":"ContainerStarted","Data":"2d879d7e526858f0fd1fc95974dff650acec22c8fc52aa47ee86184027184af8"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.263922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.265126 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.765098068 +0000 UTC m=+249.898780541 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.296702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" event={"ID":"462165ba-5266-4604-a449-6dbd4faf67b3","Type":"ContainerStarted","Data":"c72202cfc03185e2e8b61e1adc6500f467af6c472ad72a2422e75bf56f899f46"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.314819 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55132: no serving certificate available for the kubelet" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.314934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" event={"ID":"afa31e98-60fb-4eb4-8090-e9daf55a5c6c","Type":"ContainerStarted","Data":"3822a0cc96e6a80123ca99fc81e96c7195ab422603d458030f94fca5c5f438cc"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.314997 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" event={"ID":"afa31e98-60fb-4eb4-8090-e9daf55a5c6c","Type":"ContainerStarted","Data":"42411aca9e2caa366fdcb613035949c0fab11f45eba031bbd4590d2de7a53ffe"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.361570 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvs22" podStartSLOduration=180.36154966 podStartE2EDuration="3m0.36154966s" 
podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.357139203 +0000 UTC m=+249.490821626" watchObservedRunningTime="2026-03-09 16:02:02.36154966 +0000 UTC m=+249.495232083" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.361705 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h96p9" podStartSLOduration=181.361701734 podStartE2EDuration="3m1.361701734s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.281654846 +0000 UTC m=+249.415337269" watchObservedRunningTime="2026-03-09 16:02:02.361701734 +0000 UTC m=+249.495384157" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.367050 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.368987 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.868971376 +0000 UTC m=+250.002653799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.393413 4831 generic.go:334] "Generic (PLEG): container finished" podID="7ac9e931-216f-42bf-9364-f39b8cbe2b60" containerID="3521707d7bd4c1ce9a01a2c139fdd9e4a0ed301af3aae4f61dbcf42d6201e59b" exitCode=0 Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.394309 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" event={"ID":"7ac9e931-216f-42bf-9364-f39b8cbe2b60","Type":"ContainerDied","Data":"3521707d7bd4c1ce9a01a2c139fdd9e4a0ed301af3aae4f61dbcf42d6201e59b"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.394360 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" event={"ID":"7ac9e931-216f-42bf-9364-f39b8cbe2b60","Type":"ContainerStarted","Data":"4de129a7d6ec281d3bf71a721c709ce3e809d1469a1ed2612fb3ac03158dcbf9"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.408348 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2259m" event={"ID":"3fb50200-700a-4355-8e9b-65b2cb24dc02","Type":"ContainerStarted","Data":"23f7dff0d91811c700c05b21f08d371b898e83487ac7bc3774a239dbae934a5a"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.416581 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ln52t" podStartSLOduration=181.416539805 
podStartE2EDuration="3m1.416539805s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.41560927 +0000 UTC m=+249.549291693" watchObservedRunningTime="2026-03-09 16:02:02.416539805 +0000 UTC m=+249.550222238" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.426875 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.471971 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.472421 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.972377112 +0000 UTC m=+250.106059535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.472683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.473436 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:02.973422749 +0000 UTC m=+250.107105172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.474262 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55148: no serving certificate available for the kubelet" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.480591 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vqjxn" podStartSLOduration=181.480567398 podStartE2EDuration="3m1.480567398s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.474711444 +0000 UTC m=+249.608393867" watchObservedRunningTime="2026-03-09 16:02:02.480567398 +0000 UTC m=+249.614249811" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.485115 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" event={"ID":"f9d94f81-ac48-4829-b0c0-54339568c8f0","Type":"ContainerStarted","Data":"6afbfc039561d20ade629e748bb90c4bf137fba35206cc4b59c0d4646da4cca2"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.485162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" 
event={"ID":"f9d94f81-ac48-4829-b0c0-54339568c8f0","Type":"ContainerStarted","Data":"fa94a6152ee52e1fd0d6124aae59468bffeb15c9ea76a195ea6ddf784483574c"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.516914 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" event={"ID":"5032087a-e1c3-47cb-9b31-2983fe8aff99","Type":"ContainerStarted","Data":"12fb3a959256bbfab9471aae899d9f0a5665e33a4cff673f9578747cb1a23f4a"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.516971 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" event={"ID":"5032087a-e1c3-47cb-9b31-2983fe8aff99","Type":"ContainerStarted","Data":"4b15df7744bedec7b69f597bdd0cc0f2c9ad59f0492920a84d07a27eeb189810"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.519472 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z4ljk" event={"ID":"a650250c-ac76-4486-a46d-cd53b83afe05","Type":"ContainerStarted","Data":"37a8e64c8ea84636200d423971d6746fda24f8153ab52997eb47ed32e86f3e5b"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.548435 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" event={"ID":"a864a900-0b20-49ae-a846-736a9784eee1","Type":"ContainerStarted","Data":"5228801ee02afacb160cb36839a53fc80274c588dd0e52a9c24112857358c403"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.572275 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" event={"ID":"98b2ff36-f824-46b5-ad2a-d01710453cb2","Type":"ContainerStarted","Data":"c6d89cb45fce0ac1d3908afe4d20df5d37be5cbd3df59a319ea95f32374b094c"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.572352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" event={"ID":"98b2ff36-f824-46b5-ad2a-d01710453cb2","Type":"ContainerStarted","Data":"4794c060672403b8b929f2231d40b366654534bff9c7f16794e7440a761df562"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.574382 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.576507 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.076482676 +0000 UTC m=+250.210165099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.592215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" event={"ID":"33678e26-b1b2-419f-93ef-85ba9e935155","Type":"ContainerStarted","Data":"9b4b5a0e48e13ff3d9a27bfdaa10cf3455cbac202e9160e25300ae07d822ff46"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.620512 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" event={"ID":"b62fcd76-8790-4a1e-898d-b9654a876ddc","Type":"ContainerStarted","Data":"e9834ed82211d0945667f6cb206f557ef668927ceaf0926c7430747745928885"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.647570 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" event={"ID":"bfaaf56f-0a5d-434c-8bc7-f764c883e204","Type":"ContainerStarted","Data":"4a15598fcd4421c131be5b2cef7e29299ee3b3533098b421262cbdb333fd3e15"} Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.648946 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55164: no serving certificate available for the kubelet" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.662324 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" podStartSLOduration=181.662303986 podStartE2EDuration="3m1.662303986s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.657956131 +0000 UTC m=+249.791638554" watchObservedRunningTime="2026-03-09 16:02:02.662303986 +0000 UTC m=+249.795986409" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.665384 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-db24t" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.676904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.679354 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.179338627 +0000 UTC m=+250.313021050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.703596 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-npdkz" podStartSLOduration=181.703580278 podStartE2EDuration="3m1.703580278s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.70250626 +0000 UTC m=+249.836188683" watchObservedRunningTime="2026-03-09 16:02:02.703580278 +0000 UTC m=+249.837262701" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.745600 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" podStartSLOduration=181.745582229 podStartE2EDuration="3m1.745582229s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.743453803 +0000 UTC m=+249.877136226" watchObservedRunningTime="2026-03-09 16:02:02.745582229 +0000 UTC m=+249.879264652" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.778001 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.778174 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.278149931 +0000 UTC m=+250.411832354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.778710 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.782142 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.282133546 +0000 UTC m=+250.415815969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.782361 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nw95b" podStartSLOduration=181.782344082 podStartE2EDuration="3m1.782344082s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.779692972 +0000 UTC m=+249.913375395" watchObservedRunningTime="2026-03-09 16:02:02.782344082 +0000 UTC m=+249.916026505" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.792906 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55180: no serving certificate available for the kubelet" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.830830 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" podStartSLOduration=181.830813924 podStartE2EDuration="3m1.830813924s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.830735902 +0000 UTC m=+249.964418325" watchObservedRunningTime="2026-03-09 16:02:02.830813924 +0000 UTC m=+249.964496347" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.880088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.880677 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.380643992 +0000 UTC m=+250.514326415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.940908 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-fcd87" podStartSLOduration=180.940872646 podStartE2EDuration="3m0.940872646s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:02.939693694 +0000 UTC m=+250.073376127" watchObservedRunningTime="2026-03-09 16:02:02.940872646 +0000 UTC m=+250.074555069" Mar 09 16:02:02 crc kubenswrapper[4831]: I0309 16:02:02.985020 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:02 crc kubenswrapper[4831]: E0309 16:02:02.985719 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.485706711 +0000 UTC m=+250.619389124 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.025009 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c7t6"] Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.046702 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:03 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:03 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:03 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.046788 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 
16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.057333 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.057644 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.063210 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g"] Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.082968 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.087550 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.087738 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.58771313 +0000 UTC m=+250.721395553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.087960 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.088364 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.588346807 +0000 UTC m=+250.722029220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.188957 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.189810 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.689794161 +0000 UTC m=+250.823476584 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.294438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.295739 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.795718093 +0000 UTC m=+250.929400516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.395757 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.395993 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.895959075 +0000 UTC m=+251.029641508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.396237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.396644 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.896633552 +0000 UTC m=+251.030316165 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.497563 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.497741 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.997714506 +0000 UTC m=+251.131396929 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.497887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.498227 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:03.99821508 +0000 UTC m=+251.131897503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.516845 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55186: no serving certificate available for the kubelet" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.599343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.599553 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.09952572 +0000 UTC m=+251.233208143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.599846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.600125 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.100118105 +0000 UTC m=+251.233800528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.694945 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" event={"ID":"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763","Type":"ContainerStarted","Data":"6d980b35e7144aa85a3b7181c2b7d8cd51bd18c0dcf12f31e8a3a1fd76927188"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.700523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.700714 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.200685806 +0000 UTC m=+251.334368229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.700837 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.701314 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.201295612 +0000 UTC m=+251.334978095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.720211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" event={"ID":"f742f3f0-b95d-47f7-8554-5c22aa9c60f7","Type":"ContainerStarted","Data":"d2a9f36ba413b9f191923c5026ac9a9c7cecfadf5312f209da3d867a4930a14e"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.720530 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.727741 4831 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lv25s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.727831 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.744266 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4vggg" 
event={"ID":"5032087a-e1c3-47cb-9b31-2983fe8aff99","Type":"ContainerStarted","Data":"0abb7ca7cdc969007e8773b3959df8bc6f29eeeac985cda457d684b520af75f9"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.766668 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" event={"ID":"5630f338-fb68-4efc-8eb0-4b2c2fcae913","Type":"ContainerStarted","Data":"7855dbe8bba57503391582626e489581f9607d37e774cca43813326db58c71b6"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.786334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" event={"ID":"afa7f33f-fe13-438d-bd39-594f917e6015","Type":"ContainerStarted","Data":"9d0947801705f17e8564f61ea602a25f006d4a4a71d2d4690488ceb579c2d79b"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.788676 4831 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x9895 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.788725 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" podUID="afa7f33f-fe13-438d-bd39-594f917e6015" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.788909 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.801930 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.803836 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.303817594 +0000 UTC m=+251.437500027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.814609 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" event={"ID":"c5d72ddf-8850-405d-aa67-ae759dad90be","Type":"ContainerStarted","Data":"11a04c873e20670915697f7fcf3b6ecfb7a670fbf195cdc5db98c05e82c99306"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.814665 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" event={"ID":"c5d72ddf-8850-405d-aa67-ae759dad90be","Type":"ContainerStarted","Data":"7d806f4f2bbab7bb5fc88f4a22f25127d3426fd71f81154a9a0da845a034cc04"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.846016 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" 
event={"ID":"7ac9e931-216f-42bf-9364-f39b8cbe2b60","Type":"ContainerStarted","Data":"646e13d1d0a9bf2dd6ada0c17b5588fa372307f61053e53c69e134870395fa51"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.846772 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.868725 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" event={"ID":"09e969ab-296e-4da0-8347-66c9227a149a","Type":"ContainerStarted","Data":"d65f48657790f322e8a8f35293f1240aec312927315ef83aa2cab7623aa0040a"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.892636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wk2hz" event={"ID":"a864a900-0b20-49ae-a846-736a9784eee1","Type":"ContainerStarted","Data":"2c21ed75f5314edbb7f00f9784fdd4746c18c9f51be4dba1bb8b0504824e04c7"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.906142 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:03 crc kubenswrapper[4831]: E0309 16:02:03.907799 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.407786645 +0000 UTC m=+251.541469068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.932146 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z4ljk" event={"ID":"a650250c-ac76-4486-a46d-cd53b83afe05","Type":"ContainerStarted","Data":"c2251e06de2e59749c3f7d26ab04c4aefa1761104b7fd36760daf96a2e74156c"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.932203 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z4ljk" event={"ID":"a650250c-ac76-4486-a46d-cd53b83afe05","Type":"ContainerStarted","Data":"77ebb25dccb96218b60a1a746999a1c670aa089429162c8553523f5a51c0c475"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.932942 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z4ljk" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.934147 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lkkk" podStartSLOduration=182.934132212 podStartE2EDuration="3m2.934132212s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:03.932849278 +0000 UTC m=+251.066531701" watchObservedRunningTime="2026-03-09 16:02:03.934132212 +0000 UTC m=+251.067814635" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.955982 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qwsf" event={"ID":"bfaaf56f-0a5d-434c-8bc7-f764c883e204","Type":"ContainerStarted","Data":"a0fdc9f389009c78caf8319e956f5184b121af5893bd48ac1d8b5f6347f02b27"} Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.957868 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-ntw62 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.957914 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ntw62" podUID="24253089-f34c-4d7a-816f-49c18af92c20" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.961928 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" podUID="850f7c83-ddcb-4209-84d7-27c63c8b3e1c" containerName="controller-manager" containerID="cri-o://2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55" gracePeriod=30 Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.962124 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" podUID="f07d0628-187a-492f-8dee-f1e28ba448cb" containerName="route-controller-manager" containerID="cri-o://53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199" gracePeriod=30 Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.974352 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ntzjm" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 
16:02:03.979517 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" podStartSLOduration=182.979495992 podStartE2EDuration="3m2.979495992s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:03.979111101 +0000 UTC m=+251.112793524" watchObservedRunningTime="2026-03-09 16:02:03.979495992 +0000 UTC m=+251.113178415" Mar 09 16:02:03 crc kubenswrapper[4831]: I0309 16:02:03.985419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nmvzm" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.008253 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.010950 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.510930523 +0000 UTC m=+251.644612966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.019898 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:04 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:04 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:04 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.019989 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.071855 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" podStartSLOduration=182.071829763 podStartE2EDuration="3m2.071829763s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:04.044351286 +0000 UTC m=+251.178033709" watchObservedRunningTime="2026-03-09 16:02:04.071829763 +0000 UTC m=+251.205512186" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.112967 4831 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vctbp" podStartSLOduration=183.112942081 podStartE2EDuration="3m3.112942081s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:04.112119169 +0000 UTC m=+251.245801592" watchObservedRunningTime="2026-03-09 16:02:04.112942081 +0000 UTC m=+251.246624504" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.113954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.114433 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.61441609 +0000 UTC m=+251.748098513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.177337 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps5sb" podStartSLOduration=183.177305794 podStartE2EDuration="3m3.177305794s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:04.158019603 +0000 UTC m=+251.291702026" watchObservedRunningTime="2026-03-09 16:02:04.177305794 +0000 UTC m=+251.310988217" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.200473 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.201476 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.206867 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" podStartSLOduration=182.206848775 podStartE2EDuration="3m2.206848775s" podCreationTimestamp="2026-03-09 15:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:04.193248715 +0000 UTC m=+251.326931148" watchObservedRunningTime="2026-03-09 16:02:04.206848775 +0000 UTC m=+251.340531198" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.209516 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.214950 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.215292 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.715276818 +0000 UTC m=+251.848959241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.215341 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.215729 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.233419 4831 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.321285 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.321342 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.321444 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.321751 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.821734394 +0000 UTC m=+251.955416817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.350818 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z4ljk" podStartSLOduration=10.350802143 podStartE2EDuration="10.350802143s" podCreationTimestamp="2026-03-09 16:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:04.305586067 +0000 UTC m=+251.439268490" watchObservedRunningTime="2026-03-09 16:02:04.350802143 +0000 UTC m=+251.484484566" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.421983 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.422133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.422185 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.422343 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.422452 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:04.922434228 +0000 UTC m=+252.056116651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.465138 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.523717 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.523974 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:05.023962394 +0000 UTC m=+252.157644817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.577018 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.627818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.628008 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:05.127991326 +0000 UTC m=+252.261673749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.628134 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.628540 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:05.12853358 +0000 UTC m=+252.262216003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.684110 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.717588 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.729480 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.729558 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-config\") pod \"f07d0628-187a-492f-8dee-f1e28ba448cb\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.729625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgtfc\" (UniqueName: \"kubernetes.io/projected/f07d0628-187a-492f-8dee-f1e28ba448cb-kube-api-access-tgtfc\") pod \"f07d0628-187a-492f-8dee-f1e28ba448cb\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.729656 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-client-ca\") pod \"f07d0628-187a-492f-8dee-f1e28ba448cb\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.729674 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f07d0628-187a-492f-8dee-f1e28ba448cb-serving-cert\") pod \"f07d0628-187a-492f-8dee-f1e28ba448cb\" (UID: \"f07d0628-187a-492f-8dee-f1e28ba448cb\") " Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.730978 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:05.23094464 +0000 UTC m=+252.364627073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.732199 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-config" (OuterVolumeSpecName: "config") pod "f07d0628-187a-492f-8dee-f1e28ba448cb" (UID: "f07d0628-187a-492f-8dee-f1e28ba448cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.732756 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-client-ca" (OuterVolumeSpecName: "client-ca") pod "f07d0628-187a-492f-8dee-f1e28ba448cb" (UID: "f07d0628-187a-492f-8dee-f1e28ba448cb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.733942 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4"] Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.734134 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850f7c83-ddcb-4209-84d7-27c63c8b3e1c" containerName="controller-manager" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.734150 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="850f7c83-ddcb-4209-84d7-27c63c8b3e1c" containerName="controller-manager" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.734160 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07d0628-187a-492f-8dee-f1e28ba448cb" containerName="route-controller-manager" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.734166 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07d0628-187a-492f-8dee-f1e28ba448cb" containerName="route-controller-manager" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.734281 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07d0628-187a-492f-8dee-f1e28ba448cb" containerName="route-controller-manager" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.734303 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="850f7c83-ddcb-4209-84d7-27c63c8b3e1c" containerName="controller-manager" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.734627 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.745516 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07d0628-187a-492f-8dee-f1e28ba448cb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f07d0628-187a-492f-8dee-f1e28ba448cb" (UID: "f07d0628-187a-492f-8dee-f1e28ba448cb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.747365 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07d0628-187a-492f-8dee-f1e28ba448cb-kube-api-access-tgtfc" (OuterVolumeSpecName: "kube-api-access-tgtfc") pod "f07d0628-187a-492f-8dee-f1e28ba448cb" (UID: "f07d0628-187a-492f-8dee-f1e28ba448cb"). InnerVolumeSpecName "kube-api-access-tgtfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.763056 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4"] Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.830943 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-config\") pod \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831011 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-serving-cert\") pod \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831209 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-proxy-ca-bundles\") pod \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831244 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-client-ca\") pod \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831303 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktdhv\" (UniqueName: \"kubernetes.io/projected/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-kube-api-access-ktdhv\") pod \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\" (UID: \"850f7c83-ddcb-4209-84d7-27c63c8b3e1c\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831688 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-config\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llblc\" (UniqueName: \"kubernetes.io/projected/f748a5bd-955b-46cc-8953-414be09f5493-kube-api-access-llblc\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831811 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-client-ca\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831856 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.831907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f748a5bd-955b-46cc-8953-414be09f5493-serving-cert\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.832020 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgtfc\" (UniqueName: \"kubernetes.io/projected/f07d0628-187a-492f-8dee-f1e28ba448cb-kube-api-access-tgtfc\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.832040 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.832053 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f07d0628-187a-492f-8dee-f1e28ba448cb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.832066 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d0628-187a-492f-8dee-f1e28ba448cb-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.832424 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 16:02:05.332388313 +0000 UTC m=+252.466070736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mbssj" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.836093 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "850f7c83-ddcb-4209-84d7-27c63c8b3e1c" (UID: "850f7c83-ddcb-4209-84d7-27c63c8b3e1c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.836676 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-config" (OuterVolumeSpecName: "config") pod "850f7c83-ddcb-4209-84d7-27c63c8b3e1c" (UID: "850f7c83-ddcb-4209-84d7-27c63c8b3e1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.837081 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-client-ca" (OuterVolumeSpecName: "client-ca") pod "850f7c83-ddcb-4209-84d7-27c63c8b3e1c" (UID: "850f7c83-ddcb-4209-84d7-27c63c8b3e1c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.841153 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "850f7c83-ddcb-4209-84d7-27c63c8b3e1c" (UID: "850f7c83-ddcb-4209-84d7-27c63c8b3e1c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.845919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-kube-api-access-ktdhv" (OuterVolumeSpecName: "kube-api-access-ktdhv") pod "850f7c83-ddcb-4209-84d7-27c63c8b3e1c" (UID: "850f7c83-ddcb-4209-84d7-27c63c8b3e1c"). InnerVolumeSpecName "kube-api-access-ktdhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.869568 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55188: no serving certificate available for the kubelet" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.874571 4831 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T16:02:04.233640614Z","Handler":null,"Name":""} Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935007 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-config\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935322 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llblc\" (UniqueName: \"kubernetes.io/projected/f748a5bd-955b-46cc-8953-414be09f5493-kube-api-access-llblc\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-client-ca\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935457 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f748a5bd-955b-46cc-8953-414be09f5493-serving-cert\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935557 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935577 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktdhv\" (UniqueName: \"kubernetes.io/projected/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-kube-api-access-ktdhv\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935589 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935600 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.935612 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/850f7c83-ddcb-4209-84d7-27c63c8b3e1c-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 09 16:02:04 crc kubenswrapper[4831]: E0309 16:02:04.939025 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 16:02:05.439000194 +0000 UTC m=+252.572682617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.940151 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-client-ca\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.940563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-config\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.940742 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f748a5bd-955b-46cc-8953-414be09f5493-serving-cert\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: 
\"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:04 crc kubenswrapper[4831]: I0309 16:02:04.994159 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llblc\" (UniqueName: \"kubernetes.io/projected/f748a5bd-955b-46cc-8953-414be09f5493-kube-api-access-llblc\") pod \"route-controller-manager-76556f7bc-nr6w4\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.001961 4831 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.002033 4831 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.011696 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:05 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:05 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:05 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.011798 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.013982 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="850f7c83-ddcb-4209-84d7-27c63c8b3e1c" containerID="2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55" exitCode=0 Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.014105 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" event={"ID":"850f7c83-ddcb-4209-84d7-27c63c8b3e1c","Type":"ContainerDied","Data":"2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55"} Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.014208 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" event={"ID":"850f7c83-ddcb-4209-84d7-27c63c8b3e1c","Type":"ContainerDied","Data":"6f9f16bba3afefc953799c7df2bc2cef29f1775c25dfb1874981337f8f2ca6b0"} Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.014230 4831 scope.go:117] "RemoveContainer" containerID="2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.014424 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c7t6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.036352 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.039572 4831 generic.go:334] "Generic (PLEG): container finished" podID="f07d0628-187a-492f-8dee-f1e28ba448cb" containerID="53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199" exitCode=0 Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.039652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" event={"ID":"f07d0628-187a-492f-8dee-f1e28ba448cb","Type":"ContainerDied","Data":"53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199"} Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.039682 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" event={"ID":"f07d0628-187a-492f-8dee-f1e28ba448cb","Type":"ContainerDied","Data":"f4af8cdfadf7295f16f6f24b8fdd1c2cb15c425504cef1d89350ddaa74e6ebd8"} Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.039756 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.056866 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.056927 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.061290 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c7t6"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.062097 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c7t6"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.075627 4831 scope.go:117] "RemoveContainer" containerID="2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.076868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" event={"ID":"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763","Type":"ContainerStarted","Data":"c6e39e4b17977b99c181c820e2b164e940f37eaa5081094f42e94f642a8577ce"} Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.076938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" event={"ID":"1a3ad7b1-fa93-4b88-bd28-8aaea19d6763","Type":"ContainerStarted","Data":"576be80b866072ee95bf1ffbab6c707dfe10648cc4320644bd6eb4afccd1cc08"} Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.077103 4831 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lv25s container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.077155 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.086067 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:05 crc kubenswrapper[4831]: E0309 16:02:05.088800 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55\": container with ID starting with 2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55 not found: ID does not exist" containerID="2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.089808 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55"} err="failed to get container status \"2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55\": rpc error: code = NotFound desc = could not find container \"2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55\": container with ID starting with 2d2a2d9f5246df039b4a36a903e139fd53677f4e5cc1892c624e29f25b854b55 not found: ID does not exist" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.089857 4831 scope.go:117] "RemoveContainer" 
containerID="53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.131598 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-v7slb" podStartSLOduration=11.131578428 podStartE2EDuration="11.131578428s" podCreationTimestamp="2026-03-09 16:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:05.130171121 +0000 UTC m=+252.263853554" watchObservedRunningTime="2026-03-09 16:02:05.131578428 +0000 UTC m=+252.265260871" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.145172 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mbssj\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.160257 4831 scope.go:117] "RemoveContainer" containerID="53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.165981 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g"] Mar 09 16:02:05 crc kubenswrapper[4831]: E0309 16:02:05.166807 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199\": container with ID starting with 53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199 not found: ID does not exist" containerID="53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 
16:02:05.166863 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199"} err="failed to get container status \"53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199\": rpc error: code = NotFound desc = could not find container \"53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199\": container with ID starting with 53808799a12a4ffb58cb929e5cb5384e7df32e8f48eb44c68b3bcb16c6937199 not found: ID does not exist" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.181164 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k4n9g"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.211041 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.223697 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gbpnl"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.230128 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: W0309 16:02:05.231815 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7d8b598_0b74_4ce6_ad15_8ca4c81fe509.slice/crio-6fc3993907479f3e536b0f89d0e04f04f7d0ec3ce7581f83394d7ac6d2c6f4c3 WatchSource:0}: Error finding container 6fc3993907479f3e536b0f89d0e04f04f7d0ec3ce7581f83394d7ac6d2c6f4c3: Status 404 returned error can't find the container with id 6fc3993907479f3e536b0f89d0e04f04f7d0ec3ce7581f83394d7ac6d2c6f4c3 Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.232723 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.233204 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x9895" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.242921 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.245920 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gbpnl"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.260991 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.298902 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tbp9n" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.309843 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.346134 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-catalog-content\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.346631 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-utilities\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.346677 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57kkr\" (UniqueName: \"kubernetes.io/projected/1ced3eb3-5570-485a-9828-4c509ecd19f2-kube-api-access-57kkr\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.416906 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5rzt7"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 
16:02:05.418032 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.423896 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.441248 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5rzt7"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.448094 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-catalog-content\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.448180 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-catalog-content\") pod \"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.448224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-utilities\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.448243 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-utilities\") pod 
\"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.448260 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf6lf\" (UniqueName: \"kubernetes.io/projected/93ec285c-3738-4b32-b6fc-abdf28c52c55-kube-api-access-wf6lf\") pod \"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.448293 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57kkr\" (UniqueName: \"kubernetes.io/projected/1ced3eb3-5570-485a-9828-4c509ecd19f2-kube-api-access-57kkr\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.450136 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-utilities\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.455267 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-catalog-content\") pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.471380 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57kkr\" (UniqueName: \"kubernetes.io/projected/1ced3eb3-5570-485a-9828-4c509ecd19f2-kube-api-access-57kkr\") 
pod \"certified-operators-gbpnl\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.538836 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.546449 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rd78d" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.550157 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-catalog-content\") pod \"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.550463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-utilities\") pod \"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.550492 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf6lf\" (UniqueName: \"kubernetes.io/projected/93ec285c-3738-4b32-b6fc-abdf28c52c55-kube-api-access-wf6lf\") pod \"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.551691 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-catalog-content\") pod \"community-operators-5rzt7\" (UID: 
\"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.553193 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-utilities\") pod \"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.563864 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.576515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf6lf\" (UniqueName: \"kubernetes.io/projected/93ec285c-3738-4b32-b6fc-abdf28c52c55-kube-api-access-wf6lf\") pod \"community-operators-5rzt7\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.662188 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850f7c83-ddcb-4209-84d7-27c63c8b3e1c" path="/var/lib/kubelet/pods/850f7c83-ddcb-4209-84d7-27c63c8b3e1c/volumes" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.663519 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.664114 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07d0628-187a-492f-8dee-f1e28ba448cb" path="/var/lib/kubelet/pods/f07d0628-187a-492f-8dee-f1e28ba448cb/volumes" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.667414 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gqxw6"] Mar 09 
16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.670143 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqxw6"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.670240 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.759692 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.760316 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmr6\" (UniqueName: \"kubernetes.io/projected/b8ed410a-1efe-4e39-853c-f87a9dc04437-kube-api-access-pwmr6\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.760421 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-catalog-content\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.760486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-utilities\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: W0309 16:02:05.765596 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf748a5bd_955b_46cc_8953_414be09f5493.slice/crio-ca37defe7dfd0138093bcf0b5a8d7a84188ef5d24c7ca97ea94c5a8b42d92157 WatchSource:0}: Error finding container ca37defe7dfd0138093bcf0b5a8d7a84188ef5d24c7ca97ea94c5a8b42d92157: Status 404 returned error can't find the container with id ca37defe7dfd0138093bcf0b5a8d7a84188ef5d24c7ca97ea94c5a8b42d92157 Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.765655 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.811128 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cn4z8"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.812463 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.831911 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cn4z8"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.865337 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mbssj"] Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.865584 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-utilities\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.866472 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmr6\" (UniqueName: 
\"kubernetes.io/projected/b8ed410a-1efe-4e39-853c-f87a9dc04437-kube-api-access-pwmr6\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.866561 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-catalog-content\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.866579 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27cp\" (UniqueName: \"kubernetes.io/projected/3d0e6234-ab07-440a-8926-925d66e3ba7f-kube-api-access-c27cp\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.866643 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-utilities\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.866665 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-catalog-content\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.867600 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-catalog-content\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.867927 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-utilities\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.925940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmr6\" (UniqueName: \"kubernetes.io/projected/b8ed410a-1efe-4e39-853c-f87a9dc04437-kube-api-access-pwmr6\") pod \"certified-operators-gqxw6\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.967973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c27cp\" (UniqueName: \"kubernetes.io/projected/3d0e6234-ab07-440a-8926-925d66e3ba7f-kube-api-access-c27cp\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.968053 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-catalog-content\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.968098 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-utilities\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.968546 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-utilities\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:05 crc kubenswrapper[4831]: I0309 16:02:05.968983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-catalog-content\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.003157 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27cp\" (UniqueName: \"kubernetes.io/projected/3d0e6234-ab07-440a-8926-925d66e3ba7f-kube-api-access-c27cp\") pod \"community-operators-cn4z8\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.004182 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.013072 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:06 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:06 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:06 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.013119 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.096254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" event={"ID":"643b13ec-dd30-4a47-b123-76c9c3a1b5b7","Type":"ContainerStarted","Data":"3b931bb92a353621b063f98e2071f4f50b9bddae1d5f24ff55e213dfe044ec16"} Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.151879 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.169264 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5rzt7"] Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.175180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509","Type":"ContainerStarted","Data":"2e338e5c60576e5b8f7f2dc7b630ebdc4635264f0737ef252daa6e5bc0cfdac5"} Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.175268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509","Type":"ContainerStarted","Data":"6fc3993907479f3e536b0f89d0e04f04f7d0ec3ce7581f83394d7ac6d2c6f4c3"} Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.184426 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gbpnl"] Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.188161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" event={"ID":"f748a5bd-955b-46cc-8953-414be09f5493","Type":"ContainerStarted","Data":"ca37defe7dfd0138093bcf0b5a8d7a84188ef5d24c7ca97ea94c5a8b42d92157"} Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.190938 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.202525 4831 patch_prober.go:28] interesting pod/route-controller-manager-76556f7bc-nr6w4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" 
start-of-body= Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.202590 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" podUID="f748a5bd-955b-46cc-8953-414be09f5493" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.227056 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.227037578 podStartE2EDuration="2.227037578s" podCreationTimestamp="2026-03-09 16:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:06.200579628 +0000 UTC m=+253.334262061" watchObservedRunningTime="2026-03-09 16:02:06.227037578 +0000 UTC m=+253.360720001" Mar 09 16:02:06 crc kubenswrapper[4831]: W0309 16:02:06.228622 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ced3eb3_5570_485a_9828_4c509ecd19f2.slice/crio-31f8144c209e24c28a3821945d00e07ca3426b684827fe6144ff194d4d3557aa WatchSource:0}: Error finding container 31f8144c209e24c28a3821945d00e07ca3426b684827fe6144ff194d4d3557aa: Status 404 returned error can't find the container with id 31f8144c209e24c28a3821945d00e07ca3426b684827fe6144ff194d4d3557aa Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.231606 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" podStartSLOduration=3.231588808 podStartE2EDuration="3.231588808s" podCreationTimestamp="2026-03-09 16:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 16:02:06.226708989 +0000 UTC m=+253.360391412" watchObservedRunningTime="2026-03-09 16:02:06.231588808 +0000 UTC m=+253.365271231" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.401522 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqxw6"] Mar 09 16:02:06 crc kubenswrapper[4831]: W0309 16:02:06.418544 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ed410a_1efe_4e39_853c_f87a9dc04437.slice/crio-41fd6411befd516283cacdc8c66b717a9b070cc6e667c877866b0d6e9443380e WatchSource:0}: Error finding container 41fd6411befd516283cacdc8c66b717a9b070cc6e667c877866b0d6e9443380e: Status 404 returned error can't find the container with id 41fd6411befd516283cacdc8c66b717a9b070cc6e667c877866b0d6e9443380e Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.529815 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cn4z8"] Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.881211 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66d8855b44-vlmkf"] Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.883087 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.889184 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.889746 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.890697 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.890697 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.890929 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.890972 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.894013 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.909192 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66d8855b44-vlmkf"] Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.986079 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-proxy-ca-bundles\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " 
pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.986293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-client-ca\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.986459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-config\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.986558 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqgb\" (UniqueName: \"kubernetes.io/projected/2557efd7-1557-4555-a540-4d0e421f7140-kube-api-access-pjqgb\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:06 crc kubenswrapper[4831]: I0309 16:02:06.986618 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557efd7-1557-4555-a540-4d0e421f7140-serving-cert\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.006126 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:07 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:07 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:07 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.006272 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.087366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-config\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.087470 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqgb\" (UniqueName: \"kubernetes.io/projected/2557efd7-1557-4555-a540-4d0e421f7140-kube-api-access-pjqgb\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.087513 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557efd7-1557-4555-a540-4d0e421f7140-serving-cert\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.087561 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-proxy-ca-bundles\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.087609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-client-ca\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.088743 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-client-ca\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.092004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-config\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.092198 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-proxy-ca-bundles\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc 
kubenswrapper[4831]: I0309 16:02:07.109526 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557efd7-1557-4555-a540-4d0e421f7140-serving-cert\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.118077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqgb\" (UniqueName: \"kubernetes.io/projected/2557efd7-1557-4555-a540-4d0e421f7140-kube-api-access-pjqgb\") pod \"controller-manager-66d8855b44-vlmkf\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.199040 4831 generic.go:334] "Generic (PLEG): container finished" podID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerID="2d79d6e84601cfe1fa803e497d038b68ccfcc51af3b4dc406c11a1f3aaad2c5a" exitCode=0 Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.199095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rzt7" event={"ID":"93ec285c-3738-4b32-b6fc-abdf28c52c55","Type":"ContainerDied","Data":"2d79d6e84601cfe1fa803e497d038b68ccfcc51af3b4dc406c11a1f3aaad2c5a"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.199153 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rzt7" event={"ID":"93ec285c-3738-4b32-b6fc-abdf28c52c55","Type":"ContainerStarted","Data":"365095f40576a62895edc0db0b0e5470c6d4d2c274954b102d144fce369d0c75"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.204055 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerID="686f1ac64d56b5b906b3f2599e2e4fd33a83525256d31cce6bead4796380af4d" exitCode=0 Mar 09 16:02:07 crc 
kubenswrapper[4831]: I0309 16:02:07.204137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn4z8" event={"ID":"3d0e6234-ab07-440a-8926-925d66e3ba7f","Type":"ContainerDied","Data":"686f1ac64d56b5b906b3f2599e2e4fd33a83525256d31cce6bead4796380af4d"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.204163 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn4z8" event={"ID":"3d0e6234-ab07-440a-8926-925d66e3ba7f","Type":"ContainerStarted","Data":"db236957ef910c7fe0e077e0db3133cea89b13d91b169f954b26074258b0a3a2"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.249920 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" event={"ID":"643b13ec-dd30-4a47-b123-76c9c3a1b5b7","Type":"ContainerStarted","Data":"b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.251376 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.256669 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.261880 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerID="8f1234aec641c5970ba89cf2929049c77593ad0a84b1f6dc55895d0dfc28e2a8" exitCode=0 Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.261996 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqxw6" event={"ID":"b8ed410a-1efe-4e39-853c-f87a9dc04437","Type":"ContainerDied","Data":"8f1234aec641c5970ba89cf2929049c77593ad0a84b1f6dc55895d0dfc28e2a8"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.262056 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqxw6" event={"ID":"b8ed410a-1efe-4e39-853c-f87a9dc04437","Type":"ContainerStarted","Data":"41fd6411befd516283cacdc8c66b717a9b070cc6e667c877866b0d6e9443380e"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.270134 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7d8b598-0b74-4ce6-ad15-8ca4c81fe509" containerID="2e338e5c60576e5b8f7f2dc7b630ebdc4635264f0737ef252daa6e5bc0cfdac5" exitCode=0 Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.270416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509","Type":"ContainerDied","Data":"2e338e5c60576e5b8f7f2dc7b630ebdc4635264f0737ef252daa6e5bc0cfdac5"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.276705 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" podStartSLOduration=186.276684996 podStartE2EDuration="3m6.276684996s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 16:02:07.270741508 +0000 UTC m=+254.404423931" watchObservedRunningTime="2026-03-09 16:02:07.276684996 +0000 UTC m=+254.410367419" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.280881 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" event={"ID":"f748a5bd-955b-46cc-8953-414be09f5493","Type":"ContainerStarted","Data":"42863b4845feb55005080656359d733d83aa858f9681c1a17cd9470a4e4d28ca"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.285478 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.293286 4831 generic.go:334] "Generic (PLEG): container finished" podID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerID="7f1045c3112b483866d8d4d2a6f8915b54dbfd8d84f6d698612b592f4d50b55f" exitCode=0 Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.293636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbpnl" event={"ID":"1ced3eb3-5570-485a-9828-4c509ecd19f2","Type":"ContainerDied","Data":"7f1045c3112b483866d8d4d2a6f8915b54dbfd8d84f6d698612b592f4d50b55f"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.293672 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbpnl" event={"ID":"1ced3eb3-5570-485a-9828-4c509ecd19f2","Type":"ContainerStarted","Data":"31f8144c209e24c28a3821945d00e07ca3426b684827fe6144ff194d4d3557aa"} Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.424554 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.425288 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.427697 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9rfth"] Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.429597 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.431627 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.434121 4831 patch_prober.go:28] interesting pod/console-f9d7485db-wgqf6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.434159 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wgqf6" podUID="642242a2-404d-4008-aacf-ebb38010d636" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.464274 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rfth"] Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.497175 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-catalog-content\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.501828 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-utilities\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.501956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zwp\" (UniqueName: \"kubernetes.io/projected/44dba940-7ade-48aa-91e5-a358ac696126-kube-api-access-t7zwp\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.517091 4831 ???:1] "http: TLS handshake error from 192.168.126.11:55190: no serving certificate available for the kubelet" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.608567 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-catalog-content\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.608647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-utilities\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.608681 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zwp\" (UniqueName: \"kubernetes.io/projected/44dba940-7ade-48aa-91e5-a358ac696126-kube-api-access-t7zwp\") pod \"redhat-marketplace-9rfth\" 
(UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.609355 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-catalog-content\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.610262 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-utilities\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.639574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zwp\" (UniqueName: \"kubernetes.io/projected/44dba940-7ade-48aa-91e5-a358ac696126-kube-api-access-t7zwp\") pod \"redhat-marketplace-9rfth\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.684440 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-ntw62 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.684501 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ntw62" podUID="24253089-f34c-4d7a-816f-49c18af92c20" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 16:02:07 crc 
kubenswrapper[4831]: I0309 16:02:07.684608 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-ntw62 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.684662 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ntw62" podUID="24253089-f34c-4d7a-816f-49c18af92c20" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.714600 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.715377 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.722227 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.722300 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.730592 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.766114 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.807425 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.815031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.815182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.835489 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66d8855b44-vlmkf"] Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.838579 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-blz27"] Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.848991 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.859925 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-blz27"] Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.919621 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-utilities\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.919670 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-catalog-content\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.919699 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmll4\" (UniqueName: \"kubernetes.io/projected/8980d825-3bbd-4832-807a-d55b47c20e18-kube-api-access-jmll4\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.919793 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.919849 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.919941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:07 crc kubenswrapper[4831]: I0309 16:02:07.969813 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.002993 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.019552 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:08 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:08 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:08 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.019627 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" 
podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.020988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-utilities\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.021014 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-catalog-content\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.021036 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmll4\" (UniqueName: \"kubernetes.io/projected/8980d825-3bbd-4832-807a-d55b47c20e18-kube-api-access-jmll4\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.021707 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-utilities\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.021780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-catalog-content\") pod \"redhat-marketplace-blz27\" (UID: 
\"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.035778 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.047088 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmll4\" (UniqueName: \"kubernetes.io/projected/8980d825-3bbd-4832-807a-d55b47c20e18-kube-api-access-jmll4\") pod \"redhat-marketplace-blz27\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.178524 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rfth"] Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.221554 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.354670 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rfth" event={"ID":"44dba940-7ade-48aa-91e5-a358ac696126","Type":"ContainerStarted","Data":"26a5678493219447a690a7cd6c752ef4846596e60d89ca67093b7e2c2731a53d"} Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.359480 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" event={"ID":"2557efd7-1557-4555-a540-4d0e421f7140","Type":"ContainerStarted","Data":"4fc1742310c89bb623c402b801c203fc7ec20b6c05cdabcb0ad4dc481d388111"} Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.359556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" 
event={"ID":"2557efd7-1557-4555-a540-4d0e421f7140","Type":"ContainerStarted","Data":"0eaf3af1d3bd350465692cf303d5008bf78450dcec94ab1fabf82a42c4240390"} Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.360636 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.396262 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" podStartSLOduration=5.394946787 podStartE2EDuration="5.394946787s" podCreationTimestamp="2026-03-09 16:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:08.393673994 +0000 UTC m=+255.527356417" watchObservedRunningTime="2026-03-09 16:02:08.394946787 +0000 UTC m=+255.528629210" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.404075 4831 patch_prober.go:28] interesting pod/controller-manager-66d8855b44-vlmkf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.404206 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" podUID="2557efd7-1557-4555-a540-4d0e421f7140" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.435747 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.447431 4831 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-d4629"] Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.448488 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.457900 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.459745 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4629"] Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.653449 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4bw\" (UniqueName: \"kubernetes.io/projected/1cdec29c-2ba7-47b6-9446-95791b883267-kube-api-access-xs4bw\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.654005 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-catalog-content\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.654068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-utilities\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.659543 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-blz27"] Mar 09 16:02:08 crc kubenswrapper[4831]: W0309 16:02:08.731231 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8980d825_3bbd_4832_807a_d55b47c20e18.slice/crio-775422565a9fdda2adc420763238f6c1fa0c7757174e5d7369a8431d04c2636a WatchSource:0}: Error finding container 775422565a9fdda2adc420763238f6c1fa0c7757174e5d7369a8431d04c2636a: Status 404 returned error can't find the container with id 775422565a9fdda2adc420763238f6c1fa0c7757174e5d7369a8431d04c2636a Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.763189 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4bw\" (UniqueName: \"kubernetes.io/projected/1cdec29c-2ba7-47b6-9446-95791b883267-kube-api-access-xs4bw\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.763359 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-catalog-content\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.763538 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-utilities\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.764342 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-utilities\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.764902 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-catalog-content\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.804228 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4bw\" (UniqueName: \"kubernetes.io/projected/1cdec29c-2ba7-47b6-9446-95791b883267-kube-api-access-xs4bw\") pod \"redhat-operators-d4629\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.818728 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hw7pt"] Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.820623 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.836592 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw7pt"] Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.903898 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.973962 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-catalog-content\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.974095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-utilities\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:08 crc kubenswrapper[4831]: I0309 16:02:08.974142 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c24jz\" (UniqueName: \"kubernetes.io/projected/2325f3d2-538f-4529-ac16-4c7c81cd13e3-kube-api-access-c24jz\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.007079 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:09 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Mar 09 16:02:09 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:09 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.007148 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" 
podUID="09973ee5-21ce-4c4f-b422-dea474d63482" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.076241 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-catalog-content\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.076375 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-utilities\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.076442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c24jz\" (UniqueName: \"kubernetes.io/projected/2325f3d2-538f-4529-ac16-4c7c81cd13e3-kube-api-access-c24jz\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.076849 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-catalog-content\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.076903 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-utilities\") pod \"redhat-operators-hw7pt\" (UID: 
\"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.123081 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.127798 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c24jz\" (UniqueName: \"kubernetes.io/projected/2325f3d2-538f-4529-ac16-4c7c81cd13e3-kube-api-access-c24jz\") pod \"redhat-operators-hw7pt\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.159505 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.284752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kubelet-dir\") pod \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.284810 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kube-api-access\") pod \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\" (UID: \"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509\") " Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.285123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a7d8b598-0b74-4ce6-ad15-8ca4c81fe509" (UID: "a7d8b598-0b74-4ce6-ad15-8ca4c81fe509"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.299393 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a7d8b598-0b74-4ce6-ad15-8ca4c81fe509" (UID: "a7d8b598-0b74-4ce6-ad15-8ca4c81fe509"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.388222 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.388262 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7d8b598-0b74-4ce6-ad15-8ca4c81fe509-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.430995 4831 generic.go:334] "Generic (PLEG): container finished" podID="3c796eb3-c14d-48be-882d-5ae13e12918a" containerID="249ef961987b962ba9f3fae85bcca9a7590a4d1db9e57ce72496eab9b129f4dc" exitCode=0 Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.431102 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" event={"ID":"3c796eb3-c14d-48be-882d-5ae13e12918a","Type":"ContainerDied","Data":"249ef961987b962ba9f3fae85bcca9a7590a4d1db9e57ce72496eab9b129f4dc"} Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.447704 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"42b4301d-b6b3-4ed5-b678-cce42b7e585e","Type":"ContainerStarted","Data":"0ba5f04d1434fe2273bbcf5c5cfc3b69da38ae391b84a7500da16e6eca506175"} Mar 09 16:02:09 crc 
kubenswrapper[4831]: I0309 16:02:09.462650 4831 generic.go:334] "Generic (PLEG): container finished" podID="8980d825-3bbd-4832-807a-d55b47c20e18" containerID="468cbd5336e4c969443440f0347f4afce29f43ee0a144f0c9547a10bd9e8ae54" exitCode=0 Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.462757 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-blz27" event={"ID":"8980d825-3bbd-4832-807a-d55b47c20e18","Type":"ContainerDied","Data":"468cbd5336e4c969443440f0347f4afce29f43ee0a144f0c9547a10bd9e8ae54"} Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.462797 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-blz27" event={"ID":"8980d825-3bbd-4832-807a-d55b47c20e18","Type":"ContainerStarted","Data":"775422565a9fdda2adc420763238f6c1fa0c7757174e5d7369a8431d04c2636a"} Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.471175 4831 generic.go:334] "Generic (PLEG): container finished" podID="44dba940-7ade-48aa-91e5-a358ac696126" containerID="d4c9820ce84ac2cc05f61e1baca7589fa5f3bac0491f62511ef3266436faef91" exitCode=0 Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.471327 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rfth" event={"ID":"44dba940-7ade-48aa-91e5-a358ac696126","Type":"ContainerDied","Data":"d4c9820ce84ac2cc05f61e1baca7589fa5f3bac0491f62511ef3266436faef91"} Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.487079 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a7d8b598-0b74-4ce6-ad15-8ca4c81fe509","Type":"ContainerDied","Data":"6fc3993907479f3e536b0f89d0e04f04f7d0ec3ce7581f83394d7ac6d2c6f4c3"} Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.487120 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc3993907479f3e536b0f89d0e04f04f7d0ec3ce7581f83394d7ac6d2c6f4c3" Mar 09 
16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.487470 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.502748 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.593323 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4629"] Mar 09 16:02:09 crc kubenswrapper[4831]: I0309 16:02:09.663422 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw7pt"] Mar 09 16:02:09 crc kubenswrapper[4831]: W0309 16:02:09.741821 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2325f3d2_538f_4529_ac16_4c7c81cd13e3.slice/crio-20736501e625e988fe4ea51716d10fa98104952db2a6ad815c45246c7a7e9d0a WatchSource:0}: Error finding container 20736501e625e988fe4ea51716d10fa98104952db2a6ad815c45246c7a7e9d0a: Status 404 returned error can't find the container with id 20736501e625e988fe4ea51716d10fa98104952db2a6ad815c45246c7a7e9d0a Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.007630 4831 patch_prober.go:28] interesting pod/router-default-5444994796-q7h8z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 16:02:10 crc kubenswrapper[4831]: [+]has-synced ok Mar 09 16:02:10 crc kubenswrapper[4831]: [+]process-running ok Mar 09 16:02:10 crc kubenswrapper[4831]: healthz check failed Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.007740 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q7h8z" podUID="09973ee5-21ce-4c4f-b422-dea474d63482" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.505830 4831 generic.go:334] "Generic (PLEG): container finished" podID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerID="a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc" exitCode=0 Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.505967 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw7pt" event={"ID":"2325f3d2-538f-4529-ac16-4c7c81cd13e3","Type":"ContainerDied","Data":"a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc"} Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.506250 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw7pt" event={"ID":"2325f3d2-538f-4529-ac16-4c7c81cd13e3","Type":"ContainerStarted","Data":"20736501e625e988fe4ea51716d10fa98104952db2a6ad815c45246c7a7e9d0a"} Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.514945 4831 generic.go:334] "Generic (PLEG): container finished" podID="1cdec29c-2ba7-47b6-9446-95791b883267" containerID="a529540ba0731cc548c65756a613acd3f817efafd943b361910810048b80c0d3" exitCode=0 Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.515010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4629" event={"ID":"1cdec29c-2ba7-47b6-9446-95791b883267","Type":"ContainerDied","Data":"a529540ba0731cc548c65756a613acd3f817efafd943b361910810048b80c0d3"} Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.515039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4629" event={"ID":"1cdec29c-2ba7-47b6-9446-95791b883267","Type":"ContainerStarted","Data":"97a7a6f68411edc9438b6148fa9193ecdcd828663ee63666ffec01a9d7cdac1b"} Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.520072 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="42b4301d-b6b3-4ed5-b678-cce42b7e585e" containerID="8734c90f39608dd613d71859938c101c382e13d630a3821aab906dddab6b20b7" exitCode=0 Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.520155 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"42b4301d-b6b3-4ed5-b678-cce42b7e585e","Type":"ContainerDied","Data":"8734c90f39608dd613d71859938c101c382e13d630a3821aab906dddab6b20b7"} Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.612917 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.612973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.613032 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.613133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.615217 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.631688 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.638189 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.649704 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.658508 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.714887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.719734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf8f784-8094-4b1c-96bb-f7997430a0ea-metrics-certs\") pod \"network-metrics-daemon-2597x\" (UID: \"bdf8f784-8094-4b1c-96bb-f7997430a0ea\") " pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.739345 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.794725 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.886579 4831 ???:1] "http: TLS handshake error from 192.168.126.11:60538: no serving certificate available for the kubelet" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.954582 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2597x" Mar 09 16:02:10 crc kubenswrapper[4831]: I0309 16:02:10.971635 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.014213 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.019762 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-q7h8z" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.126698 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c796eb3-c14d-48be-882d-5ae13e12918a-secret-volume\") pod \"3c796eb3-c14d-48be-882d-5ae13e12918a\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.126789 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrxlk\" (UniqueName: \"kubernetes.io/projected/3c796eb3-c14d-48be-882d-5ae13e12918a-kube-api-access-wrxlk\") pod \"3c796eb3-c14d-48be-882d-5ae13e12918a\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.126850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c796eb3-c14d-48be-882d-5ae13e12918a-config-volume\") pod \"3c796eb3-c14d-48be-882d-5ae13e12918a\" (UID: \"3c796eb3-c14d-48be-882d-5ae13e12918a\") " Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.129133 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c796eb3-c14d-48be-882d-5ae13e12918a-config-volume" (OuterVolumeSpecName: 
"config-volume") pod "3c796eb3-c14d-48be-882d-5ae13e12918a" (UID: "3c796eb3-c14d-48be-882d-5ae13e12918a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.131985 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c796eb3-c14d-48be-882d-5ae13e12918a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.141262 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c796eb3-c14d-48be-882d-5ae13e12918a-kube-api-access-wrxlk" (OuterVolumeSpecName: "kube-api-access-wrxlk") pod "3c796eb3-c14d-48be-882d-5ae13e12918a" (UID: "3c796eb3-c14d-48be-882d-5ae13e12918a"). InnerVolumeSpecName "kube-api-access-wrxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.141354 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c796eb3-c14d-48be-882d-5ae13e12918a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3c796eb3-c14d-48be-882d-5ae13e12918a" (UID: "3c796eb3-c14d-48be-882d-5ae13e12918a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.233992 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrxlk\" (UniqueName: \"kubernetes.io/projected/3c796eb3-c14d-48be-882d-5ae13e12918a-kube-api-access-wrxlk\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.234027 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c796eb3-c14d-48be-882d-5ae13e12918a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.565895 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.565900 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6" event={"ID":"3c796eb3-c14d-48be-882d-5ae13e12918a","Type":"ContainerDied","Data":"b43a6697ed58c8d725a21a09f7041217cdf72beb807914f1fcdde3fae3c8f7f2"} Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.566368 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b43a6697ed58c8d725a21a09f7041217cdf72beb807914f1fcdde3fae3c8f7f2" Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.583829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"60804a20d193568096ad86af492f7c79c75456709dbb4fcd11f29b95586141df"} Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.587033 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2597x"] Mar 09 16:02:11 crc kubenswrapper[4831]: W0309 16:02:11.626620 4831 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdf8f784_8094_4b1c_96bb_f7997430a0ea.slice/crio-e26270e3e4aa9459271e88fa0f3838c06845d3527d69bed3c4122ddb3b14b1e0 WatchSource:0}: Error finding container e26270e3e4aa9459271e88fa0f3838c06845d3527d69bed3c4122ddb3b14b1e0: Status 404 returned error can't find the container with id e26270e3e4aa9459271e88fa0f3838c06845d3527d69bed3c4122ddb3b14b1e0 Mar 09 16:02:11 crc kubenswrapper[4831]: I0309 16:02:11.904646 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.060842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kubelet-dir\") pod \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.060997 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kube-api-access\") pod \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\" (UID: \"42b4301d-b6b3-4ed5-b678-cce42b7e585e\") " Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.061491 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "42b4301d-b6b3-4ed5-b678-cce42b7e585e" (UID: "42b4301d-b6b3-4ed5-b678-cce42b7e585e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.075881 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "42b4301d-b6b3-4ed5-b678-cce42b7e585e" (UID: "42b4301d-b6b3-4ed5-b678-cce42b7e585e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.162880 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.163013 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42b4301d-b6b3-4ed5-b678-cce42b7e585e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.621175 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2597x" event={"ID":"bdf8f784-8094-4b1c-96bb-f7997430a0ea","Type":"ContainerStarted","Data":"e26270e3e4aa9459271e88fa0f3838c06845d3527d69bed3c4122ddb3b14b1e0"} Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.628047 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.628264 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"42b4301d-b6b3-4ed5-b678-cce42b7e585e","Type":"ContainerDied","Data":"0ba5f04d1434fe2273bbcf5c5cfc3b69da38ae391b84a7500da16e6eca506175"} Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.628306 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba5f04d1434fe2273bbcf5c5cfc3b69da38ae391b84a7500da16e6eca506175" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.631969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2630a4c636464be187a31a13fa1557e522f14b6f81ec36b7b0c257542d9bc617"} Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.632157 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"573a8ae3284f4ba67bd0d86ad63e416521bbaceeb4176ef84907d1194b6507aa"} Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.632596 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.635283 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"844bb09bc0ec7dd1d161c6be8e5fafccf674097377adb1b0898839d3ceb4fd32"} Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.638393 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-z4ljk" Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.656680 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f80028a0631cc77df85e1e6fdb16872cbd12d821787c0bad5a3eebd7dd0fc385"} Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.656733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e0235fd14191ab960a32c3a0e2747fb01cc93efab015ee923e75c0c060a26209"} Mar 09 16:02:12 crc kubenswrapper[4831]: I0309 16:02:12.673513 4831 ???:1] "http: TLS handshake error from 192.168.126.11:60552: no serving certificate available for the kubelet" Mar 09 16:02:13 crc kubenswrapper[4831]: I0309 16:02:13.669190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2597x" event={"ID":"bdf8f784-8094-4b1c-96bb-f7997430a0ea","Type":"ContainerStarted","Data":"ac6c5b95dd3f8abf1913560383842afa0ec933b15fe0a03ea5c8c553017e5bbb"} Mar 09 16:02:13 crc kubenswrapper[4831]: I0309 16:02:13.669781 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2597x" event={"ID":"bdf8f784-8094-4b1c-96bb-f7997430a0ea","Type":"ContainerStarted","Data":"0cfbf54bc0e9bbbc029bf754018ccd98fe1cb81edc52722294241edc7edc83fe"} Mar 09 16:02:17 crc kubenswrapper[4831]: I0309 16:02:17.439453 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:02:17 crc kubenswrapper[4831]: I0309 16:02:17.444278 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wgqf6" Mar 09 16:02:17 crc kubenswrapper[4831]: I0309 16:02:17.459358 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2597x" podStartSLOduration=196.45933813 podStartE2EDuration="3m16.45933813s" podCreationTimestamp="2026-03-09 15:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:13.701678186 +0000 UTC m=+260.835360609" watchObservedRunningTime="2026-03-09 16:02:17.45933813 +0000 UTC m=+264.593020553" Mar 09 16:02:17 crc kubenswrapper[4831]: I0309 16:02:17.688957 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ntw62" Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.232525 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66d8855b44-vlmkf"] Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.233450 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" podUID="2557efd7-1557-4555-a540-4d0e421f7140" containerName="controller-manager" containerID="cri-o://4fc1742310c89bb623c402b801c203fc7ec20b6c05cdabcb0ad4dc481d388111" gracePeriod=30 Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.252100 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4"] Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.253286 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" podUID="f748a5bd-955b-46cc-8953-414be09f5493" containerName="route-controller-manager" containerID="cri-o://42863b4845feb55005080656359d733d83aa858f9681c1a17cd9470a4e4d28ca" gracePeriod=30 Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.749184 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="f748a5bd-955b-46cc-8953-414be09f5493" containerID="42863b4845feb55005080656359d733d83aa858f9681c1a17cd9470a4e4d28ca" exitCode=0 Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.749290 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" event={"ID":"f748a5bd-955b-46cc-8953-414be09f5493","Type":"ContainerDied","Data":"42863b4845feb55005080656359d733d83aa858f9681c1a17cd9470a4e4d28ca"} Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.751992 4831 generic.go:334] "Generic (PLEG): container finished" podID="2557efd7-1557-4555-a540-4d0e421f7140" containerID="4fc1742310c89bb623c402b801c203fc7ec20b6c05cdabcb0ad4dc481d388111" exitCode=0 Mar 09 16:02:22 crc kubenswrapper[4831]: I0309 16:02:22.752037 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" event={"ID":"2557efd7-1557-4555-a540-4d0e421f7140","Type":"ContainerDied","Data":"4fc1742310c89bb623c402b801c203fc7ec20b6c05cdabcb0ad4dc481d388111"} Mar 09 16:02:25 crc kubenswrapper[4831]: I0309 16:02:25.317419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.087037 4831 patch_prober.go:28] interesting pod/route-controller-manager-76556f7bc-nr6w4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.087123 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" podUID="f748a5bd-955b-46cc-8953-414be09f5493" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.879369 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925113 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"] Mar 09 16:02:26 crc kubenswrapper[4831]: E0309 16:02:26.925583 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d8b598-0b74-4ce6-ad15-8ca4c81fe509" containerName="pruner" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925619 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d8b598-0b74-4ce6-ad15-8ca4c81fe509" containerName="pruner" Mar 09 16:02:26 crc kubenswrapper[4831]: E0309 16:02:26.925645 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b4301d-b6b3-4ed5-b678-cce42b7e585e" containerName="pruner" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925654 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b4301d-b6b3-4ed5-b678-cce42b7e585e" containerName="pruner" Mar 09 16:02:26 crc kubenswrapper[4831]: E0309 16:02:26.925673 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c796eb3-c14d-48be-882d-5ae13e12918a" containerName="collect-profiles" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925683 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c796eb3-c14d-48be-882d-5ae13e12918a" containerName="collect-profiles" Mar 09 16:02:26 crc kubenswrapper[4831]: E0309 16:02:26.925701 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f748a5bd-955b-46cc-8953-414be09f5493" 
containerName="route-controller-manager" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925713 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f748a5bd-955b-46cc-8953-414be09f5493" containerName="route-controller-manager" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925884 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f748a5bd-955b-46cc-8953-414be09f5493" containerName="route-controller-manager" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925897 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c796eb3-c14d-48be-882d-5ae13e12918a" containerName="collect-profiles" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925910 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d8b598-0b74-4ce6-ad15-8ca4c81fe509" containerName="pruner" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.925927 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b4301d-b6b3-4ed5-b678-cce42b7e585e" containerName="pruner" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.926621 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.935773 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"] Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f748a5bd-955b-46cc-8953-414be09f5493-serving-cert\") pod \"f748a5bd-955b-46cc-8953-414be09f5493\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993371 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-config\") pod \"f748a5bd-955b-46cc-8953-414be09f5493\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993468 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-client-ca\") pod \"f748a5bd-955b-46cc-8953-414be09f5493\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993520 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llblc\" (UniqueName: \"kubernetes.io/projected/f748a5bd-955b-46cc-8953-414be09f5493-kube-api-access-llblc\") pod \"f748a5bd-955b-46cc-8953-414be09f5493\" (UID: \"f748a5bd-955b-46cc-8953-414be09f5493\") " Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-config\") pod 
\"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qzpv\" (UniqueName: \"kubernetes.io/projected/bf0ffd4d-3359-42b9-be25-fc5c86289179-kube-api-access-4qzpv\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0ffd4d-3359-42b9-be25-fc5c86289179-serving-cert\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.993971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-client-ca\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.994440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-client-ca" (OuterVolumeSpecName: "client-ca") pod "f748a5bd-955b-46cc-8953-414be09f5493" (UID: "f748a5bd-955b-46cc-8953-414be09f5493"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.994455 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-config" (OuterVolumeSpecName: "config") pod "f748a5bd-955b-46cc-8953-414be09f5493" (UID: "f748a5bd-955b-46cc-8953-414be09f5493"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.998564 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f748a5bd-955b-46cc-8953-414be09f5493-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f748a5bd-955b-46cc-8953-414be09f5493" (UID: "f748a5bd-955b-46cc-8953-414be09f5493"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:02:26 crc kubenswrapper[4831]: I0309 16:02:26.998956 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f748a5bd-955b-46cc-8953-414be09f5493-kube-api-access-llblc" (OuterVolumeSpecName: "kube-api-access-llblc") pod "f748a5bd-955b-46cc-8953-414be09f5493" (UID: "f748a5bd-955b-46cc-8953-414be09f5493"). InnerVolumeSpecName "kube-api-access-llblc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qzpv\" (UniqueName: \"kubernetes.io/projected/bf0ffd4d-3359-42b9-be25-fc5c86289179-kube-api-access-4qzpv\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095605 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0ffd4d-3359-42b9-be25-fc5c86289179-serving-cert\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095648 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-client-ca\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095719 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-config\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095790 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095806 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llblc\" (UniqueName: \"kubernetes.io/projected/f748a5bd-955b-46cc-8953-414be09f5493-kube-api-access-llblc\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095823 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f748a5bd-955b-46cc-8953-414be09f5493-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.095837 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f748a5bd-955b-46cc-8953-414be09f5493-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.097876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-client-ca\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.098501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-config\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.102655 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0ffd4d-3359-42b9-be25-fc5c86289179-serving-cert\") pod 
\"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.115976 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qzpv\" (UniqueName: \"kubernetes.io/projected/bf0ffd4d-3359-42b9-be25-fc5c86289179-kube-api-access-4qzpv\") pod \"route-controller-manager-5cbddd8cbf-9gjxh\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") " pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.248955 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.777556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" event={"ID":"f748a5bd-955b-46cc-8953-414be09f5493","Type":"ContainerDied","Data":"ca37defe7dfd0138093bcf0b5a8d7a84188ef5d24c7ca97ea94c5a8b42d92157"} Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.777610 4831 scope.go:117] "RemoveContainer" containerID="42863b4845feb55005080656359d733d83aa858f9681c1a17cd9470a4e4d28ca" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.777643 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4" Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.796931 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4"] Mar 09 16:02:27 crc kubenswrapper[4831]: I0309 16:02:27.800713 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556f7bc-nr6w4"] Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.258198 4831 patch_prober.go:28] interesting pod/controller-manager-66d8855b44-vlmkf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.258555 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" podUID="2557efd7-1557-4555-a540-4d0e421f7140" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.736884 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.783127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" event={"ID":"2557efd7-1557-4555-a540-4d0e421f7140","Type":"ContainerDied","Data":"0eaf3af1d3bd350465692cf303d5008bf78450dcec94ab1fabf82a42c4240390"} Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.783215 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66d8855b44-vlmkf" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.818440 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557efd7-1557-4555-a540-4d0e421f7140-serving-cert\") pod \"2557efd7-1557-4555-a540-4d0e421f7140\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.818525 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-proxy-ca-bundles\") pod \"2557efd7-1557-4555-a540-4d0e421f7140\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.818558 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjqgb\" (UniqueName: \"kubernetes.io/projected/2557efd7-1557-4555-a540-4d0e421f7140-kube-api-access-pjqgb\") pod \"2557efd7-1557-4555-a540-4d0e421f7140\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.818629 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-client-ca\") pod 
\"2557efd7-1557-4555-a540-4d0e421f7140\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.818671 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-config\") pod \"2557efd7-1557-4555-a540-4d0e421f7140\" (UID: \"2557efd7-1557-4555-a540-4d0e421f7140\") " Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.819809 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-client-ca" (OuterVolumeSpecName: "client-ca") pod "2557efd7-1557-4555-a540-4d0e421f7140" (UID: "2557efd7-1557-4555-a540-4d0e421f7140"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.819828 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2557efd7-1557-4555-a540-4d0e421f7140" (UID: "2557efd7-1557-4555-a540-4d0e421f7140"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.819935 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-config" (OuterVolumeSpecName: "config") pod "2557efd7-1557-4555-a540-4d0e421f7140" (UID: "2557efd7-1557-4555-a540-4d0e421f7140"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.823711 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2557efd7-1557-4555-a540-4d0e421f7140-kube-api-access-pjqgb" (OuterVolumeSpecName: "kube-api-access-pjqgb") pod "2557efd7-1557-4555-a540-4d0e421f7140" (UID: "2557efd7-1557-4555-a540-4d0e421f7140"). InnerVolumeSpecName "kube-api-access-pjqgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.824473 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2557efd7-1557-4555-a540-4d0e421f7140-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2557efd7-1557-4555-a540-4d0e421f7140" (UID: "2557efd7-1557-4555-a540-4d0e421f7140"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.920176 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjqgb\" (UniqueName: \"kubernetes.io/projected/2557efd7-1557-4555-a540-4d0e421f7140-kube-api-access-pjqgb\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.920233 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.920249 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.920262 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2557efd7-1557-4555-a540-4d0e421f7140-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 09 16:02:28 crc kubenswrapper[4831]: I0309 16:02:28.920274 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2557efd7-1557-4555-a540-4d0e421f7140-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.126245 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66d8855b44-vlmkf"] Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.131227 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66d8855b44-vlmkf"] Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.624013 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2557efd7-1557-4555-a540-4d0e421f7140" path="/var/lib/kubelet/pods/2557efd7-1557-4555-a540-4d0e421f7140/volumes" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.625307 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f748a5bd-955b-46cc-8953-414be09f5493" path="/var/lib/kubelet/pods/f748a5bd-955b-46cc-8953-414be09f5493/volumes" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.903606 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c"] Mar 09 16:02:29 crc kubenswrapper[4831]: E0309 16:02:29.903897 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2557efd7-1557-4555-a540-4d0e421f7140" containerName="controller-manager" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.903913 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2557efd7-1557-4555-a540-4d0e421f7140" containerName="controller-manager" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.904029 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2557efd7-1557-4555-a540-4d0e421f7140" containerName="controller-manager" Mar 09 16:02:29 crc kubenswrapper[4831]: 
I0309 16:02:29.904567 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.907817 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.908528 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.909473 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.909939 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.915389 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.916078 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.929770 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 16:02:29 crc kubenswrapper[4831]: I0309 16:02:29.934351 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c"] Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.036631 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-config\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: 
\"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.036700 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e338b6b7-8a29-493e-91ef-2d1c6af1224b-serving-cert\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.036778 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft58m\" (UniqueName: \"kubernetes.io/projected/e338b6b7-8a29-493e-91ef-2d1c6af1224b-kube-api-access-ft58m\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.036867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-client-ca\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.036897 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-proxy-ca-bundles\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.138560 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-config\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.138641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e338b6b7-8a29-493e-91ef-2d1c6af1224b-serving-cert\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.138683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft58m\" (UniqueName: \"kubernetes.io/projected/e338b6b7-8a29-493e-91ef-2d1c6af1224b-kube-api-access-ft58m\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.138711 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-client-ca\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.138727 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-proxy-ca-bundles\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " 
pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.140056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-client-ca\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.140950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-proxy-ca-bundles\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.141012 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-config\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.148755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e338b6b7-8a29-493e-91ef-2d1c6af1224b-serving-cert\") pod \"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.161817 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft58m\" (UniqueName: \"kubernetes.io/projected/e338b6b7-8a29-493e-91ef-2d1c6af1224b-kube-api-access-ft58m\") pod 
\"controller-manager-6cd4c649f6-cdj7c\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: I0309 16:02:30.223986 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.608613 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.608865 4831 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 16:02:30 crc kubenswrapper[4831]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 09 16:02:30 crc kubenswrapper[4831]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8fjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29551202-rgtkz_openshift-infra(33678e26-b1b2-419f-93ef-85ba9e935155): ErrImagePull: rpc error: code = Canceled desc = copying system 
image from manifest list: copying config: context canceled Mar 09 16:02:30 crc kubenswrapper[4831]: > logger="UnhandledError" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.610960 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" podUID="33678e26-b1b2-419f-93ef-85ba9e935155" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.626534 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.626701 4831 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 16:02:30 crc kubenswrapper[4831]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 09 16:02:30 crc kubenswrapper[4831]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dqks8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
auto-csr-approver-29551200-x7tx8_openshift-infra(a013f059-7440-4f06-88f6-a73f3286d228): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 09 16:02:30 crc kubenswrapper[4831]: > logger="UnhandledError" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.627974 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" podUID="a013f059-7440-4f06-88f6-a73f3286d228" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.799092 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" podUID="33678e26-b1b2-419f-93ef-85ba9e935155" Mar 09 16:02:30 crc kubenswrapper[4831]: E0309 16:02:30.799155 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" podUID="a013f059-7440-4f06-88f6-a73f3286d228" Mar 09 16:02:33 crc kubenswrapper[4831]: I0309 16:02:33.018809 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:02:33 crc kubenswrapper[4831]: I0309 16:02:33.019894 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:02:33 crc kubenswrapper[4831]: I0309 16:02:33.176760 4831 ???:1] "http: TLS handshake error from 192.168.126.11:59778: no serving certificate available for the kubelet" Mar 09 16:02:33 crc kubenswrapper[4831]: I0309 16:02:33.363545 4831 scope.go:117] "RemoveContainer" containerID="4fc1742310c89bb623c402b801c203fc7ec20b6c05cdabcb0ad4dc481d388111" Mar 09 16:02:37 crc kubenswrapper[4831]: I0309 16:02:37.494738 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5f6d8" Mar 09 16:02:38 crc kubenswrapper[4831]: E0309 16:02:38.935288 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 16:02:38 crc kubenswrapper[4831]: E0309 16:02:38.935546 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c24jz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hw7pt_openshift-marketplace(2325f3d2-538f-4529-ac16-4c7c81cd13e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 16:02:38 crc kubenswrapper[4831]: E0309 16:02:38.936756 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hw7pt" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" Mar 09 16:02:40 crc 
kubenswrapper[4831]: I0309 16:02:40.092078 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.092988 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.096926 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.096970 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.103211 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.199640 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/306753e1-a72f-4509-ac89-68b8d87107f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"306753e1-a72f-4509-ac89-68b8d87107f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.199709 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/306753e1-a72f-4509-ac89-68b8d87107f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"306753e1-a72f-4509-ac89-68b8d87107f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.301022 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/306753e1-a72f-4509-ac89-68b8d87107f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"306753e1-a72f-4509-ac89-68b8d87107f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.301073 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/306753e1-a72f-4509-ac89-68b8d87107f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"306753e1-a72f-4509-ac89-68b8d87107f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.301231 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/306753e1-a72f-4509-ac89-68b8d87107f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"306753e1-a72f-4509-ac89-68b8d87107f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.324325 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/306753e1-a72f-4509-ac89-68b8d87107f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"306753e1-a72f-4509-ac89-68b8d87107f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:40 crc kubenswrapper[4831]: I0309 16:02:40.462193 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.356385 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hw7pt" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.454074 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.454505 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57kkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gbpnl_openshift-marketplace(1ced3eb3-5570-485a-9828-4c509ecd19f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.455690 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gbpnl" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" Mar 09 16:02:41 crc 
kubenswrapper[4831]: E0309 16:02:41.504410 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.504565 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs4bw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-d4629_openshift-marketplace(1cdec29c-2ba7-47b6-9446-95791b883267): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.505763 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-d4629" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.575573 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.575721 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pwmr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gqxw6_openshift-marketplace(b8ed410a-1efe-4e39-853c-f87a9dc04437): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 16:02:41 crc kubenswrapper[4831]: E0309 16:02:41.576879 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gqxw6" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" Mar 09 16:02:42 crc 
kubenswrapper[4831]: I0309 16:02:42.245528 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c"] Mar 09 16:02:42 crc kubenswrapper[4831]: I0309 16:02:42.335511 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"] Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.098492 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gbpnl" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.098525 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-d4629" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.098525 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gqxw6" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.191878 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.192039 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c27cp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-cn4z8_openshift-marketplace(3d0e6234-ab07-440a-8926-925d66e3ba7f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.193236 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cn4z8" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.211202 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.211363 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wf6lf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5rzt7_openshift-marketplace(93ec285c-3738-4b32-b6fc-abdf28c52c55): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 16:02:43 crc kubenswrapper[4831]: E0309 16:02:43.218978 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5rzt7" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.426592 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 16:02:43 crc kubenswrapper[4831]: W0309 16:02:43.434870 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod306753e1_a72f_4509_ac89_68b8d87107f3.slice/crio-86ba76eaad8a3f54895905c3dc67e4adfaef112fcda8932f68f9fee68916eb78 WatchSource:0}: Error finding container 86ba76eaad8a3f54895905c3dc67e4adfaef112fcda8932f68f9fee68916eb78: Status 404 returned error can't find the container with id 86ba76eaad8a3f54895905c3dc67e4adfaef112fcda8932f68f9fee68916eb78 Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.465588 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c"] Mar 09 16:02:43 crc kubenswrapper[4831]: W0309 16:02:43.471369 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode338b6b7_8a29_493e_91ef_2d1c6af1224b.slice/crio-70aaf1ec6c0217ac24f2d48a4e4264c272056adc1ef0225a36f67f6942d2f8db WatchSource:0}: Error finding container 70aaf1ec6c0217ac24f2d48a4e4264c272056adc1ef0225a36f67f6942d2f8db: Status 404 returned error can't find the container with id 70aaf1ec6c0217ac24f2d48a4e4264c272056adc1ef0225a36f67f6942d2f8db Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.561113 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"] Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.912233 4831 generic.go:334] "Generic (PLEG): container finished" podID="44dba940-7ade-48aa-91e5-a358ac696126" containerID="756416447f5f3c2658dff1037ce89876436dadc185adee582083b1bf1d00d1e7" exitCode=0 Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.912620 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rfth" event={"ID":"44dba940-7ade-48aa-91e5-a358ac696126","Type":"ContainerDied","Data":"756416447f5f3c2658dff1037ce89876436dadc185adee582083b1bf1d00d1e7"} Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.927724 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" event={"ID":"e338b6b7-8a29-493e-91ef-2d1c6af1224b","Type":"ContainerStarted","Data":"08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7"} Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.927778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" event={"ID":"e338b6b7-8a29-493e-91ef-2d1c6af1224b","Type":"ContainerStarted","Data":"70aaf1ec6c0217ac24f2d48a4e4264c272056adc1ef0225a36f67f6942d2f8db"} Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.927937 4831 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" podUID="e338b6b7-8a29-493e-91ef-2d1c6af1224b" containerName="controller-manager" containerID="cri-o://08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7" gracePeriod=30 Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.928679 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.942431 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.965555 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"306753e1-a72f-4509-ac89-68b8d87107f3","Type":"ContainerStarted","Data":"875d9a00fb090814d65982783eff0b01325a328f3ba4745461ff49b24627ffc5"} Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.965620 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"306753e1-a72f-4509-ac89-68b8d87107f3","Type":"ContainerStarted","Data":"86ba76eaad8a3f54895905c3dc67e4adfaef112fcda8932f68f9fee68916eb78"} Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.976617 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" podUID="bf0ffd4d-3359-42b9-be25-fc5c86289179" containerName="route-controller-manager" containerID="cri-o://c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d" gracePeriod=30 Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.976711 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" 
event={"ID":"bf0ffd4d-3359-42b9-be25-fc5c86289179","Type":"ContainerStarted","Data":"c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d"} Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.976733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" event={"ID":"bf0ffd4d-3359-42b9-be25-fc5c86289179","Type":"ContainerStarted","Data":"8cfb046804ce72eee6c4b65b64f46cf32f8dc866a70b644ab98bee76dd3e5045"} Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.976917 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.982786 4831 patch_prober.go:28] interesting pod/route-controller-manager-5cbddd8cbf-9gjxh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 09 16:02:43 crc kubenswrapper[4831]: I0309 16:02:43.982834 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" podUID="bf0ffd4d-3359-42b9-be25-fc5c86289179" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.004606 4831 generic.go:334] "Generic (PLEG): container finished" podID="8980d825-3bbd-4832-807a-d55b47c20e18" containerID="0077f6b2c7dac7711c290b80d5c459f727fcbf1aaa7c12edf5245703d7170014" exitCode=0 Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.004781 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-blz27" 
event={"ID":"8980d825-3bbd-4832-807a-d55b47c20e18","Type":"ContainerDied","Data":"0077f6b2c7dac7711c290b80d5c459f727fcbf1aaa7c12edf5245703d7170014"} Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.015778 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" podStartSLOduration=22.015755202 podStartE2EDuration="22.015755202s" podCreationTimestamp="2026-03-09 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:44.011608296 +0000 UTC m=+291.145290719" watchObservedRunningTime="2026-03-09 16:02:44.015755202 +0000 UTC m=+291.149437625" Mar 09 16:02:44 crc kubenswrapper[4831]: E0309 16:02:44.027050 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cn4z8" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" Mar 09 16:02:44 crc kubenswrapper[4831]: E0309 16:02:44.027639 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5rzt7" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.054278 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" podStartSLOduration=22.054248182 podStartE2EDuration="22.054248182s" podCreationTimestamp="2026-03-09 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 16:02:44.044864709 +0000 UTC m=+291.178547132" watchObservedRunningTime="2026-03-09 16:02:44.054248182 +0000 UTC m=+291.187930605" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.081690 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.081667021 podStartE2EDuration="4.081667021s" podCreationTimestamp="2026-03-09 16:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:44.076331692 +0000 UTC m=+291.210014115" watchObservedRunningTime="2026-03-09 16:02:44.081667021 +0000 UTC m=+291.215349444" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.316597 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.348455 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"] Mar 09 16:02:44 crc kubenswrapper[4831]: E0309 16:02:44.349126 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e338b6b7-8a29-493e-91ef-2d1c6af1224b" containerName="controller-manager" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.349146 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e338b6b7-8a29-493e-91ef-2d1c6af1224b" containerName="controller-manager" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.349269 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e338b6b7-8a29-493e-91ef-2d1c6af1224b" containerName="controller-manager" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.349735 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.359623 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"] Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.372682 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-client-ca\") pod \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.372832 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-proxy-ca-bundles\") pod \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.372894 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft58m\" (UniqueName: \"kubernetes.io/projected/e338b6b7-8a29-493e-91ef-2d1c6af1224b-kube-api-access-ft58m\") pod \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.372939 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e338b6b7-8a29-493e-91ef-2d1c6af1224b-serving-cert\") pod \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\" (UID: \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") " Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.372977 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-config\") pod \"e338b6b7-8a29-493e-91ef-2d1c6af1224b\" (UID: 
\"e338b6b7-8a29-493e-91ef-2d1c6af1224b\") "
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.373646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e338b6b7-8a29-493e-91ef-2d1c6af1224b" (UID: "e338b6b7-8a29-493e-91ef-2d1c6af1224b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.373667 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e338b6b7-8a29-493e-91ef-2d1c6af1224b" (UID: "e338b6b7-8a29-493e-91ef-2d1c6af1224b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.374451 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-config" (OuterVolumeSpecName: "config") pod "e338b6b7-8a29-493e-91ef-2d1c6af1224b" (UID: "e338b6b7-8a29-493e-91ef-2d1c6af1224b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.376159 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5cbddd8cbf-9gjxh_bf0ffd4d-3359-42b9-be25-fc5c86289179/route-controller-manager/0.log"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.376226 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.380232 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e338b6b7-8a29-493e-91ef-2d1c6af1224b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e338b6b7-8a29-493e-91ef-2d1c6af1224b" (UID: "e338b6b7-8a29-493e-91ef-2d1c6af1224b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.380522 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e338b6b7-8a29-493e-91ef-2d1c6af1224b-kube-api-access-ft58m" (OuterVolumeSpecName: "kube-api-access-ft58m") pod "e338b6b7-8a29-493e-91ef-2d1c6af1224b" (UID: "e338b6b7-8a29-493e-91ef-2d1c6af1224b"). InnerVolumeSpecName "kube-api-access-ft58m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.473598 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-client-ca\") pod \"bf0ffd4d-3359-42b9-be25-fc5c86289179\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") "
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.473656 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-config\") pod \"bf0ffd4d-3359-42b9-be25-fc5c86289179\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") "
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.473721 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qzpv\" (UniqueName: \"kubernetes.io/projected/bf0ffd4d-3359-42b9-be25-fc5c86289179-kube-api-access-4qzpv\") pod \"bf0ffd4d-3359-42b9-be25-fc5c86289179\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") "
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.473783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0ffd4d-3359-42b9-be25-fc5c86289179-serving-cert\") pod \"bf0ffd4d-3359-42b9-be25-fc5c86289179\" (UID: \"bf0ffd4d-3359-42b9-be25-fc5c86289179\") "
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.473905 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-proxy-ca-bundles\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.473939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11384b3-cbaf-411e-93d2-7faeb4a8579a-serving-cert\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.473972 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-config\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474025 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-client-ca\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474064 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzsh\" (UniqueName: \"kubernetes.io/projected/d11384b3-cbaf-411e-93d2-7faeb4a8579a-kube-api-access-lqzsh\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474107 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-config\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474121 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474132 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e338b6b7-8a29-493e-91ef-2d1c6af1224b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474141 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft58m\" (UniqueName: \"kubernetes.io/projected/e338b6b7-8a29-493e-91ef-2d1c6af1224b-kube-api-access-ft58m\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474150 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e338b6b7-8a29-493e-91ef-2d1c6af1224b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.474357 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf0ffd4d-3359-42b9-be25-fc5c86289179" (UID: "bf0ffd4d-3359-42b9-be25-fc5c86289179"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.475127 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-config" (OuterVolumeSpecName: "config") pod "bf0ffd4d-3359-42b9-be25-fc5c86289179" (UID: "bf0ffd4d-3359-42b9-be25-fc5c86289179"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.480270 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf0ffd4d-3359-42b9-be25-fc5c86289179-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf0ffd4d-3359-42b9-be25-fc5c86289179" (UID: "bf0ffd4d-3359-42b9-be25-fc5c86289179"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.480526 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0ffd4d-3359-42b9-be25-fc5c86289179-kube-api-access-4qzpv" (OuterVolumeSpecName: "kube-api-access-4qzpv") pod "bf0ffd4d-3359-42b9-be25-fc5c86289179" (UID: "bf0ffd4d-3359-42b9-be25-fc5c86289179"). InnerVolumeSpecName "kube-api-access-4qzpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575183 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-client-ca\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575267 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzsh\" (UniqueName: \"kubernetes.io/projected/d11384b3-cbaf-411e-93d2-7faeb4a8579a-kube-api-access-lqzsh\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-proxy-ca-bundles\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575339 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11384b3-cbaf-411e-93d2-7faeb4a8579a-serving-cert\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575367 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-config\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575445 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0ffd4d-3359-42b9-be25-fc5c86289179-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575461 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575473 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0ffd4d-3359-42b9-be25-fc5c86289179-config\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.575484 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qzpv\" (UniqueName: \"kubernetes.io/projected/bf0ffd4d-3359-42b9-be25-fc5c86289179-kube-api-access-4qzpv\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.584249 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-proxy-ca-bundles\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.585907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-config\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.593231 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-client-ca\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.593747 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11384b3-cbaf-411e-93d2-7faeb4a8579a-serving-cert\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.603752 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzsh\" (UniqueName: \"kubernetes.io/projected/d11384b3-cbaf-411e-93d2-7faeb4a8579a-kube-api-access-lqzsh\") pod \"controller-manager-685f54fcc4-gt2kw\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.690895 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:44 crc kubenswrapper[4831]: I0309 16:02:44.900109 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"]
Mar 09 16:02:44 crc kubenswrapper[4831]: W0309 16:02:44.906762 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11384b3_cbaf_411e_93d2_7faeb4a8579a.slice/crio-4301be18eb4ebe54ae06a26be007405e904fd9cfac0291dc7e6bda85228f6432 WatchSource:0}: Error finding container 4301be18eb4ebe54ae06a26be007405e904fd9cfac0291dc7e6bda85228f6432: Status 404 returned error can't find the container with id 4301be18eb4ebe54ae06a26be007405e904fd9cfac0291dc7e6bda85228f6432
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.034496 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-blz27" event={"ID":"8980d825-3bbd-4832-807a-d55b47c20e18","Type":"ContainerStarted","Data":"9af31752a1985aca37c688e00ea1d03a29a02bba476d94064d20668f83e1ad13"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.040259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rfth" event={"ID":"44dba940-7ade-48aa-91e5-a358ac696126","Type":"ContainerStarted","Data":"5a08315e716b902ce353dbbe3ddb0abe6714d139864b939d28443f59e4cc904f"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.041868 4831 generic.go:334] "Generic (PLEG): container finished" podID="e338b6b7-8a29-493e-91ef-2d1c6af1224b" containerID="08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7" exitCode=0
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.041977 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" event={"ID":"e338b6b7-8a29-493e-91ef-2d1c6af1224b","Type":"ContainerDied","Data":"08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.042007 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c" event={"ID":"e338b6b7-8a29-493e-91ef-2d1c6af1224b","Type":"ContainerDied","Data":"70aaf1ec6c0217ac24f2d48a4e4264c272056adc1ef0225a36f67f6942d2f8db"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.042026 4831 scope.go:117] "RemoveContainer" containerID="08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.042125 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.062459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" event={"ID":"d11384b3-cbaf-411e-93d2-7faeb4a8579a","Type":"ContainerStarted","Data":"4301be18eb4ebe54ae06a26be007405e904fd9cfac0291dc7e6bda85228f6432"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.066674 4831 generic.go:334] "Generic (PLEG): container finished" podID="306753e1-a72f-4509-ac89-68b8d87107f3" containerID="875d9a00fb090814d65982783eff0b01325a328f3ba4745461ff49b24627ffc5" exitCode=0
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.066873 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"306753e1-a72f-4509-ac89-68b8d87107f3","Type":"ContainerDied","Data":"875d9a00fb090814d65982783eff0b01325a328f3ba4745461ff49b24627ffc5"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.074370 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-blz27" podStartSLOduration=3.038716142 podStartE2EDuration="38.074353592s" podCreationTimestamp="2026-03-09 16:02:07 +0000 UTC" firstStartedPulling="2026-03-09 16:02:09.464871162 +0000 UTC m=+256.598553595" lastFinishedPulling="2026-03-09 16:02:44.500508622 +0000 UTC m=+291.634191045" observedRunningTime="2026-03-09 16:02:45.052733076 +0000 UTC m=+292.186415499" watchObservedRunningTime="2026-03-09 16:02:45.074353592 +0000 UTC m=+292.208036015"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.075856 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5cbddd8cbf-9gjxh_bf0ffd4d-3359-42b9-be25-fc5c86289179/route-controller-manager/0.log"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.075985 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf0ffd4d-3359-42b9-be25-fc5c86289179" containerID="c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d" exitCode=2
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.076079 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" event={"ID":"bf0ffd4d-3359-42b9-be25-fc5c86289179","Type":"ContainerDied","Data":"c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.076161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh" event={"ID":"bf0ffd4d-3359-42b9-be25-fc5c86289179","Type":"ContainerDied","Data":"8cfb046804ce72eee6c4b65b64f46cf32f8dc866a70b644ab98bee76dd3e5045"}
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.076274 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.085613 4831 scope.go:117] "RemoveContainer" containerID="08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7"
Mar 09 16:02:45 crc kubenswrapper[4831]: E0309 16:02:45.087167 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7\": container with ID starting with 08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7 not found: ID does not exist" containerID="08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.087200 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7"} err="failed to get container status \"08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7\": rpc error: code = NotFound desc = could not find container \"08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7\": container with ID starting with 08c6a1eb34c7b66dc4053d7023c4adfa51d09539dcb81910b9831ca7efc6a3e7 not found: ID does not exist"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.087224 4831 scope.go:117] "RemoveContainer" containerID="c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.094162 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9rfth" podStartSLOduration=3.182363441 podStartE2EDuration="38.094136087s" podCreationTimestamp="2026-03-09 16:02:07 +0000 UTC" firstStartedPulling="2026-03-09 16:02:09.477884946 +0000 UTC m=+256.611567369" lastFinishedPulling="2026-03-09 16:02:44.389657592 +0000 UTC m=+291.523340015" observedRunningTime="2026-03-09 16:02:45.086950776 +0000 UTC m=+292.220633199" watchObservedRunningTime="2026-03-09 16:02:45.094136087 +0000 UTC m=+292.227818510"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.120940 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c"]
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.124860 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cd4c649f6-cdj7c"]
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.132138 4831 scope.go:117] "RemoveContainer" containerID="c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d"
Mar 09 16:02:45 crc kubenswrapper[4831]: E0309 16:02:45.134183 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d\": container with ID starting with c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d not found: ID does not exist" containerID="c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.135048 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d"} err="failed to get container status \"c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d\": rpc error: code = NotFound desc = could not find container \"c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d\": container with ID starting with c5b168c1f6ab881096b7e86670d913e49f43546dc937a211b25caf922f78df2d not found: ID does not exist"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.167378 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"]
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.173003 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbddd8cbf-9gjxh"]
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.628307 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf0ffd4d-3359-42b9-be25-fc5c86289179" path="/var/lib/kubelet/pods/bf0ffd4d-3359-42b9-be25-fc5c86289179/volumes"
Mar 09 16:02:45 crc kubenswrapper[4831]: I0309 16:02:45.629305 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e338b6b7-8a29-493e-91ef-2d1c6af1224b" path="/var/lib/kubelet/pods/e338b6b7-8a29-493e-91ef-2d1c6af1224b/volumes"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.090833 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" event={"ID":"a013f059-7440-4f06-88f6-a73f3286d228","Type":"ContainerStarted","Data":"f0d6460c1557da868152743db7dc7bd72657fe2e1045b75281d92f71d0fa9b65"}
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.094152 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" event={"ID":"d11384b3-cbaf-411e-93d2-7faeb4a8579a","Type":"ContainerStarted","Data":"35d2ffe63676accf66bbbfc1ef0a3881d0cb21649801cccc4543d2a7586ad198"}
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.094338 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.103797 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.138207 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" podStartSLOduration=4.138187519 podStartE2EDuration="4.138187519s" podCreationTimestamp="2026-03-09 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:46.128357343 +0000 UTC m=+293.262039766" watchObservedRunningTime="2026-03-09 16:02:46.138187519 +0000 UTC m=+293.271869942"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.139442 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" podStartSLOduration=120.997467969 podStartE2EDuration="2m46.139433674s" podCreationTimestamp="2026-03-09 16:00:00 +0000 UTC" firstStartedPulling="2026-03-09 16:02:00.364216113 +0000 UTC m=+247.497898536" lastFinishedPulling="2026-03-09 16:02:45.506181818 +0000 UTC m=+292.639864241" observedRunningTime="2026-03-09 16:02:46.1111128 +0000 UTC m=+293.244795223" watchObservedRunningTime="2026-03-09 16:02:46.139433674 +0000 UTC m=+293.273116087"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.376007 4831 csr.go:261] certificate signing request csr-pbg94 is approved, waiting to be issued
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.383221 4831 csr.go:257] certificate signing request csr-pbg94 is issued
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.389974 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.501808 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/306753e1-a72f-4509-ac89-68b8d87107f3-kubelet-dir\") pod \"306753e1-a72f-4509-ac89-68b8d87107f3\" (UID: \"306753e1-a72f-4509-ac89-68b8d87107f3\") "
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.501910 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/306753e1-a72f-4509-ac89-68b8d87107f3-kube-api-access\") pod \"306753e1-a72f-4509-ac89-68b8d87107f3\" (UID: \"306753e1-a72f-4509-ac89-68b8d87107f3\") "
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.501932 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/306753e1-a72f-4509-ac89-68b8d87107f3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "306753e1-a72f-4509-ac89-68b8d87107f3" (UID: "306753e1-a72f-4509-ac89-68b8d87107f3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.502134 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/306753e1-a72f-4509-ac89-68b8d87107f3-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.512209 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306753e1-a72f-4509-ac89-68b8d87107f3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "306753e1-a72f-4509-ac89-68b8d87107f3" (UID: "306753e1-a72f-4509-ac89-68b8d87107f3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.603735 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/306753e1-a72f-4509-ac89-68b8d87107f3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.916245 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"]
Mar 09 16:02:46 crc kubenswrapper[4831]: E0309 16:02:46.916473 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0ffd4d-3359-42b9-be25-fc5c86289179" containerName="route-controller-manager"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.916484 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0ffd4d-3359-42b9-be25-fc5c86289179" containerName="route-controller-manager"
Mar 09 16:02:46 crc kubenswrapper[4831]: E0309 16:02:46.916501 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306753e1-a72f-4509-ac89-68b8d87107f3" containerName="pruner"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.916508 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="306753e1-a72f-4509-ac89-68b8d87107f3" containerName="pruner"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.916608 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0ffd4d-3359-42b9-be25-fc5c86289179" containerName="route-controller-manager"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.916622 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="306753e1-a72f-4509-ac89-68b8d87107f3" containerName="pruner"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.916988 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.919554 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.919917 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.920237 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.920267 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.920652 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.921280 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 16:02:46 crc kubenswrapper[4831]: I0309 16:02:46.931124 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"]
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.008611 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktmq7\" (UniqueName: \"kubernetes.io/projected/7f19dba3-76b9-4f0d-9f09-cb63843578d2-kube-api-access-ktmq7\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.008680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-config\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.008726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f19dba3-76b9-4f0d-9f09-cb63843578d2-serving-cert\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.008795 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-client-ca\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.103585 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"306753e1-a72f-4509-ac89-68b8d87107f3","Type":"ContainerDied","Data":"86ba76eaad8a3f54895905c3dc67e4adfaef112fcda8932f68f9fee68916eb78"}
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.103904 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86ba76eaad8a3f54895905c3dc67e4adfaef112fcda8932f68f9fee68916eb78"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.103651 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.105344 4831 generic.go:334] "Generic (PLEG): container finished" podID="a013f059-7440-4f06-88f6-a73f3286d228" containerID="f0d6460c1557da868152743db7dc7bd72657fe2e1045b75281d92f71d0fa9b65" exitCode=0
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.105433 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" event={"ID":"a013f059-7440-4f06-88f6-a73f3286d228","Type":"ContainerDied","Data":"f0d6460c1557da868152743db7dc7bd72657fe2e1045b75281d92f71d0fa9b65"}
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.107229 4831 generic.go:334] "Generic (PLEG): container finished" podID="33678e26-b1b2-419f-93ef-85ba9e935155" containerID="533d2358e13f31f47eafe8b323538343a36cfe938cd4e76fafc31a237d2bc94e" exitCode=0
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.107313 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" event={"ID":"33678e26-b1b2-419f-93ef-85ba9e935155","Type":"ContainerDied","Data":"533d2358e13f31f47eafe8b323538343a36cfe938cd4e76fafc31a237d2bc94e"}
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.109633 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-client-ca\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.109701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktmq7\" (UniqueName: \"kubernetes.io/projected/7f19dba3-76b9-4f0d-9f09-cb63843578d2-kube-api-access-ktmq7\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.109728 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-config\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.109754 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f19dba3-76b9-4f0d-9f09-cb63843578d2-serving-cert\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.110712 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-client-ca\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.111274 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-config\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.118329 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f19dba3-76b9-4f0d-9f09-cb63843578d2-serving-cert\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.129334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktmq7\" (UniqueName: \"kubernetes.io/projected/7f19dba3-76b9-4f0d-9f09-cb63843578d2-kube-api-access-ktmq7\") pod \"route-controller-manager-5766d5bff4-jtjpc\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.274985 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.299759 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.300844 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.303614 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.304039 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.304221 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.385141 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-18 06:02:17.074665911 +0000 UTC Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.385176 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6805h59m29.689492187s for next certificate rotation Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.422961 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.423298 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.423531 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-var-lock\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.512909 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"] Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.525068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-var-lock\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.525137 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.525170 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.525441 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.525564 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-var-lock\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.546844 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.652538 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.767230 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.769701 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:47 crc kubenswrapper[4831]: I0309 16:02:47.864079 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.075694 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.121538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" event={"ID":"7f19dba3-76b9-4f0d-9f09-cb63843578d2","Type":"ContainerStarted","Data":"02386fcc4d1e1acc7bc81c60fce826ea32e80a69130eda9acc66a2773766abdb"} Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.121598 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" event={"ID":"7f19dba3-76b9-4f0d-9f09-cb63843578d2","Type":"ContainerStarted","Data":"293a42f05fbeaa26838a9185ce582bf1c326a353ff70ae371783dc055f1c7508"} Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.122649 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.126228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb770b7c-9f0b-4e5e-a912-f34519e56e13","Type":"ContainerStarted","Data":"e67a77d8df618ea70c7921b4949ea474e4424c993235c21b10ba132ae6aa34dd"} Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.226620 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.227027 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.240777 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.267439 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" podStartSLOduration=6.267403776 podStartE2EDuration="6.267403776s" podCreationTimestamp="2026-03-09 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:48.144710253 +0000 UTC m=+295.278392696" watchObservedRunningTime="2026-03-09 16:02:48.267403776 
+0000 UTC m=+295.401086199" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.300453 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.385551 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-07 14:49:32.748662701 +0000 UTC Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.385588 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7294h46m44.363077209s for next certificate rotation Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.394717 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.438067 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8fjm\" (UniqueName: \"kubernetes.io/projected/33678e26-b1b2-419f-93ef-85ba9e935155-kube-api-access-g8fjm\") pod \"33678e26-b1b2-419f-93ef-85ba9e935155\" (UID: \"33678e26-b1b2-419f-93ef-85ba9e935155\") " Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.444037 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33678e26-b1b2-419f-93ef-85ba9e935155-kube-api-access-g8fjm" (OuterVolumeSpecName: "kube-api-access-g8fjm") pod "33678e26-b1b2-419f-93ef-85ba9e935155" (UID: "33678e26-b1b2-419f-93ef-85ba9e935155"). InnerVolumeSpecName "kube-api-access-g8fjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.493964 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.539283 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqks8\" (UniqueName: \"kubernetes.io/projected/a013f059-7440-4f06-88f6-a73f3286d228-kube-api-access-dqks8\") pod \"a013f059-7440-4f06-88f6-a73f3286d228\" (UID: \"a013f059-7440-4f06-88f6-a73f3286d228\") " Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.539556 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8fjm\" (UniqueName: \"kubernetes.io/projected/33678e26-b1b2-419f-93ef-85ba9e935155-kube-api-access-g8fjm\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.544837 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a013f059-7440-4f06-88f6-a73f3286d228-kube-api-access-dqks8" (OuterVolumeSpecName: "kube-api-access-dqks8") pod "a013f059-7440-4f06-88f6-a73f3286d228" (UID: "a013f059-7440-4f06-88f6-a73f3286d228"). InnerVolumeSpecName "kube-api-access-dqks8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:48 crc kubenswrapper[4831]: I0309 16:02:48.640687 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqks8\" (UniqueName: \"kubernetes.io/projected/a013f059-7440-4f06-88f6-a73f3286d228-kube-api-access-dqks8\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.133346 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" event={"ID":"33678e26-b1b2-419f-93ef-85ba9e935155","Type":"ContainerDied","Data":"9b4b5a0e48e13ff3d9a27bfdaa10cf3455cbac202e9160e25300ae07d822ff46"} Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.133389 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b4b5a0e48e13ff3d9a27bfdaa10cf3455cbac202e9160e25300ae07d822ff46" Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.133422 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551202-rgtkz" Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.139470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" event={"ID":"a013f059-7440-4f06-88f6-a73f3286d228","Type":"ContainerDied","Data":"b5e2f88c5e35c01ac6509b71ad7d8302a039d9d7b1aa0d7f31d0f97792cecab9"} Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.139518 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e2f88c5e35c01ac6509b71ad7d8302a039d9d7b1aa0d7f31d0f97792cecab9" Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.139789 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551200-x7tx8" Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.141904 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb770b7c-9f0b-4e5e-a912-f34519e56e13","Type":"ContainerStarted","Data":"a2a99e375e224dd2c4b1bccf69cf2d00472d6e1564d79f2dc720b4d77ed92f02"} Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.169645 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.169611888 podStartE2EDuration="2.169611888s" podCreationTimestamp="2026-03-09 16:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:02:49.158745163 +0000 UTC m=+296.292427576" watchObservedRunningTime="2026-03-09 16:02:49.169611888 +0000 UTC m=+296.303294301" Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.193111 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:02:49 crc kubenswrapper[4831]: I0309 16:02:49.194190 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:50 crc kubenswrapper[4831]: I0309 16:02:50.109198 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-blz27"] Mar 09 16:02:50 crc kubenswrapper[4831]: I0309 16:02:50.801957 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 16:02:51 crc kubenswrapper[4831]: I0309 16:02:51.153602 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-blz27" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="registry-server" 
containerID="cri-o://9af31752a1985aca37c688e00ea1d03a29a02bba476d94064d20668f83e1ad13" gracePeriod=2 Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.162033 4831 generic.go:334] "Generic (PLEG): container finished" podID="8980d825-3bbd-4832-807a-d55b47c20e18" containerID="9af31752a1985aca37c688e00ea1d03a29a02bba476d94064d20668f83e1ad13" exitCode=0 Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.162084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-blz27" event={"ID":"8980d825-3bbd-4832-807a-d55b47c20e18","Type":"ContainerDied","Data":"9af31752a1985aca37c688e00ea1d03a29a02bba476d94064d20668f83e1ad13"} Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.331471 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.447212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-catalog-content\") pod \"8980d825-3bbd-4832-807a-d55b47c20e18\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.447292 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmll4\" (UniqueName: \"kubernetes.io/projected/8980d825-3bbd-4832-807a-d55b47c20e18-kube-api-access-jmll4\") pod \"8980d825-3bbd-4832-807a-d55b47c20e18\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.447356 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-utilities\") pod \"8980d825-3bbd-4832-807a-d55b47c20e18\" (UID: \"8980d825-3bbd-4832-807a-d55b47c20e18\") " Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 
16:02:52.448249 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-utilities" (OuterVolumeSpecName: "utilities") pod "8980d825-3bbd-4832-807a-d55b47c20e18" (UID: "8980d825-3bbd-4832-807a-d55b47c20e18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.452668 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8980d825-3bbd-4832-807a-d55b47c20e18-kube-api-access-jmll4" (OuterVolumeSpecName: "kube-api-access-jmll4") pod "8980d825-3bbd-4832-807a-d55b47c20e18" (UID: "8980d825-3bbd-4832-807a-d55b47c20e18"). InnerVolumeSpecName "kube-api-access-jmll4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.485560 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8980d825-3bbd-4832-807a-d55b47c20e18" (UID: "8980d825-3bbd-4832-807a-d55b47c20e18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.548865 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.548911 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmll4\" (UniqueName: \"kubernetes.io/projected/8980d825-3bbd-4832-807a-d55b47c20e18-kube-api-access-jmll4\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:52 crc kubenswrapper[4831]: I0309 16:02:52.548923 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8980d825-3bbd-4832-807a-d55b47c20e18-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.172676 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-blz27" event={"ID":"8980d825-3bbd-4832-807a-d55b47c20e18","Type":"ContainerDied","Data":"775422565a9fdda2adc420763238f6c1fa0c7757174e5d7369a8431d04c2636a"} Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.172741 4831 scope.go:117] "RemoveContainer" containerID="9af31752a1985aca37c688e00ea1d03a29a02bba476d94064d20668f83e1ad13" Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.172830 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-blz27" Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.193667 4831 scope.go:117] "RemoveContainer" containerID="0077f6b2c7dac7711c290b80d5c459f727fcbf1aaa7c12edf5245703d7170014" Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.216814 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-blz27"] Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.223663 4831 scope.go:117] "RemoveContainer" containerID="468cbd5336e4c969443440f0347f4afce29f43ee0a144f0c9547a10bd9e8ae54" Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.224515 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-blz27"] Mar 09 16:02:53 crc kubenswrapper[4831]: I0309 16:02:53.629007 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" path="/var/lib/kubelet/pods/8980d825-3bbd-4832-807a-d55b47c20e18/volumes" Mar 09 16:02:56 crc kubenswrapper[4831]: I0309 16:02:56.192045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4629" event={"ID":"1cdec29c-2ba7-47b6-9446-95791b883267","Type":"ContainerStarted","Data":"981e9d8def2c2bff2d2987a1b189a7f7778b17162e0e031cbeda4e2670c0f653"} Mar 09 16:02:57 crc kubenswrapper[4831]: I0309 16:02:57.203276 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerID="af7c6260d816052121f8c27f4c8f8431bc091c7268e5e6d517400d383bbe76db" exitCode=0 Mar 09 16:02:57 crc kubenswrapper[4831]: I0309 16:02:57.203380 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn4z8" event={"ID":"3d0e6234-ab07-440a-8926-925d66e3ba7f","Type":"ContainerDied","Data":"af7c6260d816052121f8c27f4c8f8431bc091c7268e5e6d517400d383bbe76db"} Mar 09 16:02:57 crc kubenswrapper[4831]: I0309 16:02:57.208359 4831 
generic.go:334] "Generic (PLEG): container finished" podID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerID="3c42e78670ae015904a53fe22b905f570cc44d16297721d3102a9ab581658393" exitCode=0 Mar 09 16:02:57 crc kubenswrapper[4831]: I0309 16:02:57.208479 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqxw6" event={"ID":"b8ed410a-1efe-4e39-853c-f87a9dc04437","Type":"ContainerDied","Data":"3c42e78670ae015904a53fe22b905f570cc44d16297721d3102a9ab581658393"} Mar 09 16:02:57 crc kubenswrapper[4831]: I0309 16:02:57.233127 4831 generic.go:334] "Generic (PLEG): container finished" podID="1cdec29c-2ba7-47b6-9446-95791b883267" containerID="981e9d8def2c2bff2d2987a1b189a7f7778b17162e0e031cbeda4e2670c0f653" exitCode=0 Mar 09 16:02:57 crc kubenswrapper[4831]: I0309 16:02:57.233168 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4629" event={"ID":"1cdec29c-2ba7-47b6-9446-95791b883267","Type":"ContainerDied","Data":"981e9d8def2c2bff2d2987a1b189a7f7778b17162e0e031cbeda4e2670c0f653"} Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.240974 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4629" event={"ID":"1cdec29c-2ba7-47b6-9446-95791b883267","Type":"ContainerStarted","Data":"7bbf68197dbce2e98c0fcc985a29ab3dde6c5630c2745a5ec6b958a3dccf6a93"} Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.242556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn4z8" event={"ID":"3d0e6234-ab07-440a-8926-925d66e3ba7f","Type":"ContainerStarted","Data":"5e7b3dee6c500e08dd9cdff87e2b25f7025df0d338346566011bad68c06e8ce9"} Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.244211 4831 generic.go:334] "Generic (PLEG): container finished" podID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerID="041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258" exitCode=0 Mar 09 16:02:58 
crc kubenswrapper[4831]: I0309 16:02:58.244276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw7pt" event={"ID":"2325f3d2-538f-4529-ac16-4c7c81cd13e3","Type":"ContainerDied","Data":"041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258"} Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.246816 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqxw6" event={"ID":"b8ed410a-1efe-4e39-853c-f87a9dc04437","Type":"ContainerStarted","Data":"106b83e181b48b74e554e080a94fb5945d6e88b33a1e018d79139b019c59c264"} Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.262784 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d4629" podStartSLOduration=3.118298803 podStartE2EDuration="50.262766464s" podCreationTimestamp="2026-03-09 16:02:08 +0000 UTC" firstStartedPulling="2026-03-09 16:02:10.517506038 +0000 UTC m=+257.651188471" lastFinishedPulling="2026-03-09 16:02:57.661973669 +0000 UTC m=+304.795656132" observedRunningTime="2026-03-09 16:02:58.261758715 +0000 UTC m=+305.395441148" watchObservedRunningTime="2026-03-09 16:02:58.262766464 +0000 UTC m=+305.396448887" Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.287104 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cn4z8" podStartSLOduration=2.7555390429999997 podStartE2EDuration="53.287087106s" podCreationTimestamp="2026-03-09 16:02:05 +0000 UTC" firstStartedPulling="2026-03-09 16:02:07.206310974 +0000 UTC m=+254.339993397" lastFinishedPulling="2026-03-09 16:02:57.737859027 +0000 UTC m=+304.871541460" observedRunningTime="2026-03-09 16:02:58.283243968 +0000 UTC m=+305.416926391" watchObservedRunningTime="2026-03-09 16:02:58.287087106 +0000 UTC m=+305.420769529" Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.322177 4831 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/certified-operators-gqxw6" podStartSLOduration=2.860800631 podStartE2EDuration="53.32215081s" podCreationTimestamp="2026-03-09 16:02:05 +0000 UTC" firstStartedPulling="2026-03-09 16:02:07.284460861 +0000 UTC m=+254.418143284" lastFinishedPulling="2026-03-09 16:02:57.74581104 +0000 UTC m=+304.879493463" observedRunningTime="2026-03-09 16:02:58.320686489 +0000 UTC m=+305.454368912" watchObservedRunningTime="2026-03-09 16:02:58.32215081 +0000 UTC m=+305.455833233" Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.905157 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:58 crc kubenswrapper[4831]: I0309 16:02:58.905211 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:02:59 crc kubenswrapper[4831]: I0309 16:02:59.254176 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw7pt" event={"ID":"2325f3d2-538f-4529-ac16-4c7c81cd13e3","Type":"ContainerStarted","Data":"e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8"} Mar 09 16:02:59 crc kubenswrapper[4831]: I0309 16:02:59.255913 4831 generic.go:334] "Generic (PLEG): container finished" podID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerID="8218ee1072dd7a15964d3371aafb4c61a4a06682deef4d8139e7aff6eed285a4" exitCode=0 Mar 09 16:02:59 crc kubenswrapper[4831]: I0309 16:02:59.256695 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbpnl" event={"ID":"1ced3eb3-5570-485a-9828-4c509ecd19f2","Type":"ContainerDied","Data":"8218ee1072dd7a15964d3371aafb4c61a4a06682deef4d8139e7aff6eed285a4"} Mar 09 16:02:59 crc kubenswrapper[4831]: I0309 16:02:59.279522 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hw7pt" 
podStartSLOduration=3.069187837 podStartE2EDuration="51.279503799s" podCreationTimestamp="2026-03-09 16:02:08 +0000 UTC" firstStartedPulling="2026-03-09 16:02:10.510643747 +0000 UTC m=+257.644326170" lastFinishedPulling="2026-03-09 16:02:58.720959709 +0000 UTC m=+305.854642132" observedRunningTime="2026-03-09 16:02:59.27916186 +0000 UTC m=+306.412844293" watchObservedRunningTime="2026-03-09 16:02:59.279503799 +0000 UTC m=+306.413186222" Mar 09 16:02:59 crc kubenswrapper[4831]: I0309 16:02:59.941383 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4629" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="registry-server" probeResult="failure" output=< Mar 09 16:02:59 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Mar 09 16:02:59 crc kubenswrapper[4831]: > Mar 09 16:03:00 crc kubenswrapper[4831]: I0309 16:03:00.267959 4831 generic.go:334] "Generic (PLEG): container finished" podID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerID="a4e66708b18d563c97a50d20b20227e74d862a5c3a95746a75eba0032126c75b" exitCode=0 Mar 09 16:03:00 crc kubenswrapper[4831]: I0309 16:03:00.268009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rzt7" event={"ID":"93ec285c-3738-4b32-b6fc-abdf28c52c55","Type":"ContainerDied","Data":"a4e66708b18d563c97a50d20b20227e74d862a5c3a95746a75eba0032126c75b"} Mar 09 16:03:01 crc kubenswrapper[4831]: I0309 16:03:01.276196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rzt7" event={"ID":"93ec285c-3738-4b32-b6fc-abdf28c52c55","Type":"ContainerStarted","Data":"7039dd4fab36a97b97ac50d9e38551bc0c7b458e5a6489bdcb23d1dd52b541e7"} Mar 09 16:03:01 crc kubenswrapper[4831]: I0309 16:03:01.278687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbpnl" 
event={"ID":"1ced3eb3-5570-485a-9828-4c509ecd19f2","Type":"ContainerStarted","Data":"3178ecacf213c3dee197d8de880c32dd60b61e115c399d5ad5e82b068e2abb77"} Mar 09 16:03:01 crc kubenswrapper[4831]: I0309 16:03:01.301781 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5rzt7" podStartSLOduration=2.823613655 podStartE2EDuration="56.301766486s" podCreationTimestamp="2026-03-09 16:02:05 +0000 UTC" firstStartedPulling="2026-03-09 16:02:07.204815104 +0000 UTC m=+254.338497527" lastFinishedPulling="2026-03-09 16:03:00.682967935 +0000 UTC m=+307.816650358" observedRunningTime="2026-03-09 16:03:01.299916464 +0000 UTC m=+308.433598897" watchObservedRunningTime="2026-03-09 16:03:01.301766486 +0000 UTC m=+308.435448909" Mar 09 16:03:01 crc kubenswrapper[4831]: I0309 16:03:01.323791 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gbpnl" podStartSLOduration=3.262635604 podStartE2EDuration="56.323775762s" podCreationTimestamp="2026-03-09 16:02:05 +0000 UTC" firstStartedPulling="2026-03-09 16:02:07.29614749 +0000 UTC m=+254.429829913" lastFinishedPulling="2026-03-09 16:03:00.357287628 +0000 UTC m=+307.490970071" observedRunningTime="2026-03-09 16:03:01.32049771 +0000 UTC m=+308.454180143" watchObservedRunningTime="2026-03-09 16:03:01.323775762 +0000 UTC m=+308.457458185" Mar 09 16:03:02 crc kubenswrapper[4831]: I0309 16:03:02.282102 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"] Mar 09 16:03:02 crc kubenswrapper[4831]: I0309 16:03:02.282800 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" podUID="d11384b3-cbaf-411e-93d2-7faeb4a8579a" containerName="controller-manager" containerID="cri-o://35d2ffe63676accf66bbbfc1ef0a3881d0cb21649801cccc4543d2a7586ad198" gracePeriod=30 Mar 
09 16:03:02 crc kubenswrapper[4831]: I0309 16:03:02.311923 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"] Mar 09 16:03:02 crc kubenswrapper[4831]: I0309 16:03:02.312232 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" podUID="7f19dba3-76b9-4f0d-9f09-cb63843578d2" containerName="route-controller-manager" containerID="cri-o://02386fcc4d1e1acc7bc81c60fce826ea32e80a69130eda9acc66a2773766abdb" gracePeriod=30 Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.018688 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.018746 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.018787 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.019434 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.019491 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7" gracePeriod=600 Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.296713 4831 generic.go:334] "Generic (PLEG): container finished" podID="d11384b3-cbaf-411e-93d2-7faeb4a8579a" containerID="35d2ffe63676accf66bbbfc1ef0a3881d0cb21649801cccc4543d2a7586ad198" exitCode=0 Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.296836 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" event={"ID":"d11384b3-cbaf-411e-93d2-7faeb4a8579a","Type":"ContainerDied","Data":"35d2ffe63676accf66bbbfc1ef0a3881d0cb21649801cccc4543d2a7586ad198"} Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.300217 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7" exitCode=0 Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.300292 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7"} Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.301481 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f19dba3-76b9-4f0d-9f09-cb63843578d2" containerID="02386fcc4d1e1acc7bc81c60fce826ea32e80a69130eda9acc66a2773766abdb" exitCode=0 Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.301509 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" event={"ID":"7f19dba3-76b9-4f0d-9f09-cb63843578d2","Type":"ContainerDied","Data":"02386fcc4d1e1acc7bc81c60fce826ea32e80a69130eda9acc66a2773766abdb"} Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.352665 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.380900 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7"] Mar 09 16:03:03 crc kubenswrapper[4831]: E0309 16:03:03.381166 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="registry-server" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381184 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="registry-server" Mar 09 16:03:03 crc kubenswrapper[4831]: E0309 16:03:03.381200 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f19dba3-76b9-4f0d-9f09-cb63843578d2" containerName="route-controller-manager" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381209 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f19dba3-76b9-4f0d-9f09-cb63843578d2" containerName="route-controller-manager" Mar 09 16:03:03 crc kubenswrapper[4831]: E0309 16:03:03.381222 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a013f059-7440-4f06-88f6-a73f3286d228" containerName="oc" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381230 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a013f059-7440-4f06-88f6-a73f3286d228" containerName="oc" Mar 09 16:03:03 crc kubenswrapper[4831]: E0309 16:03:03.381247 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33678e26-b1b2-419f-93ef-85ba9e935155" containerName="oc" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381256 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="33678e26-b1b2-419f-93ef-85ba9e935155" containerName="oc" Mar 09 16:03:03 crc kubenswrapper[4831]: E0309 16:03:03.381269 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="extract-utilities" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381278 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="extract-utilities" Mar 09 16:03:03 crc kubenswrapper[4831]: E0309 16:03:03.381290 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="extract-content" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381297 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="extract-content" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381437 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a013f059-7440-4f06-88f6-a73f3286d228" containerName="oc" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381453 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f19dba3-76b9-4f0d-9f09-cb63843578d2" containerName="route-controller-manager" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381470 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="33678e26-b1b2-419f-93ef-85ba9e935155" containerName="oc" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381483 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8980d825-3bbd-4832-807a-d55b47c20e18" containerName="registry-server" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.381930 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.405167 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7"] Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.463218 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.505449 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-config\") pod \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.505610 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktmq7\" (UniqueName: \"kubernetes.io/projected/7f19dba3-76b9-4f0d-9f09-cb63843578d2-kube-api-access-ktmq7\") pod \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.505744 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f19dba3-76b9-4f0d-9f09-cb63843578d2-serving-cert\") pod \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.505768 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-client-ca\") pod \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\" (UID: \"7f19dba3-76b9-4f0d-9f09-cb63843578d2\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.505996 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-client-ca\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.506032 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7affd259-b338-4ea3-b135-ed314370262c-serving-cert\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.506063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-config\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.506095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnpq8\" (UniqueName: \"kubernetes.io/projected/7affd259-b338-4ea3-b135-ed314370262c-kube-api-access-qnpq8\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.506365 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-client-ca" (OuterVolumeSpecName: "client-ca") 
pod "7f19dba3-76b9-4f0d-9f09-cb63843578d2" (UID: "7f19dba3-76b9-4f0d-9f09-cb63843578d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.506382 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-config" (OuterVolumeSpecName: "config") pod "7f19dba3-76b9-4f0d-9f09-cb63843578d2" (UID: "7f19dba3-76b9-4f0d-9f09-cb63843578d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.511855 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f19dba3-76b9-4f0d-9f09-cb63843578d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7f19dba3-76b9-4f0d-9f09-cb63843578d2" (UID: "7f19dba3-76b9-4f0d-9f09-cb63843578d2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.518625 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f19dba3-76b9-4f0d-9f09-cb63843578d2-kube-api-access-ktmq7" (OuterVolumeSpecName: "kube-api-access-ktmq7") pod "7f19dba3-76b9-4f0d-9f09-cb63843578d2" (UID: "7f19dba3-76b9-4f0d-9f09-cb63843578d2"). InnerVolumeSpecName "kube-api-access-ktmq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.606998 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzsh\" (UniqueName: \"kubernetes.io/projected/d11384b3-cbaf-411e-93d2-7faeb4a8579a-kube-api-access-lqzsh\") pod \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607080 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11384b3-cbaf-411e-93d2-7faeb4a8579a-serving-cert\") pod \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607236 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-client-ca\") pod \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607298 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-config\") pod \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607415 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-proxy-ca-bundles\") pod \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\" (UID: \"d11384b3-cbaf-411e-93d2-7faeb4a8579a\") " Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607615 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-client-ca\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607668 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7affd259-b338-4ea3-b135-ed314370262c-serving-cert\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607699 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-config\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607727 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnpq8\" (UniqueName: \"kubernetes.io/projected/7affd259-b338-4ea3-b135-ed314370262c-kube-api-access-qnpq8\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607785 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607800 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktmq7\" (UniqueName: 
\"kubernetes.io/projected/7f19dba3-76b9-4f0d-9f09-cb63843578d2-kube-api-access-ktmq7\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607817 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f19dba3-76b9-4f0d-9f09-cb63843578d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.607828 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f19dba3-76b9-4f0d-9f09-cb63843578d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.608769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d11384b3-cbaf-411e-93d2-7faeb4a8579a" (UID: "d11384b3-cbaf-411e-93d2-7faeb4a8579a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.609249 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-client-ca\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.609308 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-client-ca" (OuterVolumeSpecName: "client-ca") pod "d11384b3-cbaf-411e-93d2-7faeb4a8579a" (UID: "d11384b3-cbaf-411e-93d2-7faeb4a8579a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.610034 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-config" (OuterVolumeSpecName: "config") pod "d11384b3-cbaf-411e-93d2-7faeb4a8579a" (UID: "d11384b3-cbaf-411e-93d2-7faeb4a8579a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.610652 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-config\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.612779 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11384b3-cbaf-411e-93d2-7faeb4a8579a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d11384b3-cbaf-411e-93d2-7faeb4a8579a" (UID: "d11384b3-cbaf-411e-93d2-7faeb4a8579a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.612855 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11384b3-cbaf-411e-93d2-7faeb4a8579a-kube-api-access-lqzsh" (OuterVolumeSpecName: "kube-api-access-lqzsh") pod "d11384b3-cbaf-411e-93d2-7faeb4a8579a" (UID: "d11384b3-cbaf-411e-93d2-7faeb4a8579a"). InnerVolumeSpecName "kube-api-access-lqzsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.613521 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7affd259-b338-4ea3-b135-ed314370262c-serving-cert\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.639240 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnpq8\" (UniqueName: \"kubernetes.io/projected/7affd259-b338-4ea3-b135-ed314370262c-kube-api-access-qnpq8\") pod \"route-controller-manager-5dff78cf7b-dsqd7\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.709057 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11384b3-cbaf-411e-93d2-7faeb4a8579a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.709559 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.709574 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.709587 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d11384b3-cbaf-411e-93d2-7faeb4a8579a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc 
kubenswrapper[4831]: I0309 16:03:03.709604 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzsh\" (UniqueName: \"kubernetes.io/projected/d11384b3-cbaf-411e-93d2-7faeb4a8579a-kube-api-access-lqzsh\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.729616 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:03 crc kubenswrapper[4831]: I0309 16:03:03.982194 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7"] Mar 09 16:03:03 crc kubenswrapper[4831]: W0309 16:03:03.991266 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7affd259_b338_4ea3_b135_ed314370262c.slice/crio-368b570c9e7b3e297edcb530865d4c7550a6cf82f84259f96e842a1e995057a6 WatchSource:0}: Error finding container 368b570c9e7b3e297edcb530865d4c7550a6cf82f84259f96e842a1e995057a6: Status 404 returned error can't find the container with id 368b570c9e7b3e297edcb530865d4c7550a6cf82f84259f96e842a1e995057a6 Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.308995 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" event={"ID":"d11384b3-cbaf-411e-93d2-7faeb4a8579a","Type":"ContainerDied","Data":"4301be18eb4ebe54ae06a26be007405e904fd9cfac0291dc7e6bda85228f6432"} Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.309214 4831 scope.go:117] "RemoveContainer" containerID="35d2ffe63676accf66bbbfc1ef0a3881d0cb21649801cccc4543d2a7586ad198" Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.309051 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-685f54fcc4-gt2kw" Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.313901 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"7d5b4725b752da5e408f875b6cd3b85c3aa6ec6e9210f6108fa23cf97bee9077"} Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.315538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" event={"ID":"7affd259-b338-4ea3-b135-ed314370262c","Type":"ContainerStarted","Data":"a6b400282aa36d561ae5f4615cea864d517d5207356c70af5b05a355ee046452"} Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.315592 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" event={"ID":"7affd259-b338-4ea3-b135-ed314370262c","Type":"ContainerStarted","Data":"368b570c9e7b3e297edcb530865d4c7550a6cf82f84259f96e842a1e995057a6"} Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.315821 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.318083 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" event={"ID":"7f19dba3-76b9-4f0d-9f09-cb63843578d2","Type":"ContainerDied","Data":"293a42f05fbeaa26838a9185ce582bf1c326a353ff70ae371783dc055f1c7508"} Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.318163 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc" Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.326565 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"] Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.330371 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-685f54fcc4-gt2kw"] Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.338062 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"] Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.341044 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766d5bff4-jtjpc"] Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.350462 4831 scope.go:117] "RemoveContainer" containerID="02386fcc4d1e1acc7bc81c60fce826ea32e80a69130eda9acc66a2773766abdb" Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.359422 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" podStartSLOduration=2.3593939600000002 podStartE2EDuration="2.35939396s" podCreationTimestamp="2026-03-09 16:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:03:04.357773285 +0000 UTC m=+311.491455728" watchObservedRunningTime="2026-03-09 16:03:04.35939396 +0000 UTC m=+311.493076383" Mar 09 16:03:04 crc kubenswrapper[4831]: I0309 16:03:04.659482 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.564867 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.565239 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.627654 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f19dba3-76b9-4f0d-9f09-cb63843578d2" path="/var/lib/kubelet/pods/7f19dba3-76b9-4f0d-9f09-cb63843578d2/volumes" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.628303 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11384b3-cbaf-411e-93d2-7faeb4a8579a" path="/var/lib/kubelet/pods/d11384b3-cbaf-411e-93d2-7faeb4a8579a/volumes" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.629095 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.760999 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.761043 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.827622 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.931120 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6946984dd4-rl27n"] Mar 09 16:03:05 crc kubenswrapper[4831]: E0309 16:03:05.931455 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11384b3-cbaf-411e-93d2-7faeb4a8579a" containerName="controller-manager" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 
16:03:05.931480 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11384b3-cbaf-411e-93d2-7faeb4a8579a" containerName="controller-manager" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.931601 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11384b3-cbaf-411e-93d2-7faeb4a8579a" containerName="controller-manager" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.932021 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.934264 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.934843 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.935180 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.935336 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.935568 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.935813 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.949167 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 16:03:05 crc kubenswrapper[4831]: I0309 16:03:05.964669 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-6946984dd4-rl27n"] Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.005545 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.005646 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.040032 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-client-ca\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.040122 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-proxy-ca-bundles\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.040197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wxtg\" (UniqueName: \"kubernetes.io/projected/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-kube-api-access-6wxtg\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.040254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-serving-cert\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.040367 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-config\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.048098 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.141175 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-client-ca\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.141249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-proxy-ca-bundles\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.141285 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wxtg\" (UniqueName: \"kubernetes.io/projected/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-kube-api-access-6wxtg\") pod 
\"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.141325 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-serving-cert\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.141355 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-config\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.142292 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-client-ca\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.142457 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-proxy-ca-bundles\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.144912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-config\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.150295 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-serving-cert\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.153221 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.153274 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.160224 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wxtg\" (UniqueName: \"kubernetes.io/projected/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-kube-api-access-6wxtg\") pod \"controller-manager-6946984dd4-rl27n\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.200305 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.249446 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.405904 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.407632 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.410904 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.418673 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:03:06 crc kubenswrapper[4831]: I0309 16:03:06.662611 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6946984dd4-rl27n"] Mar 09 16:03:07 crc kubenswrapper[4831]: I0309 16:03:07.356680 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" event={"ID":"a5fd7ef9-49b7-420a-832e-27dff0ed37f1","Type":"ContainerStarted","Data":"796b22d8e7bd768506831e107dab9bfc13cc73ff17f5b5d54dc84b9f650d37e2"} Mar 09 16:03:07 crc kubenswrapper[4831]: I0309 16:03:07.357484 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:07 crc kubenswrapper[4831]: I0309 16:03:07.357508 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" event={"ID":"a5fd7ef9-49b7-420a-832e-27dff0ed37f1","Type":"ContainerStarted","Data":"0f4dc967824ca002841327240f96cf8da36e1335ce909200a8717d938f9411ef"} Mar 09 16:03:07 crc kubenswrapper[4831]: I0309 16:03:07.362711 
4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:07 crc kubenswrapper[4831]: I0309 16:03:07.377873 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" podStartSLOduration=5.377846715 podStartE2EDuration="5.377846715s" podCreationTimestamp="2026-03-09 16:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:03:07.3772982 +0000 UTC m=+314.510980663" watchObservedRunningTime="2026-03-09 16:03:07.377846715 +0000 UTC m=+314.511529138" Mar 09 16:03:07 crc kubenswrapper[4831]: I0309 16:03:07.497460 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqxw6"] Mar 09 16:03:08 crc kubenswrapper[4831]: I0309 16:03:08.095928 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cn4z8"] Mar 09 16:03:08 crc kubenswrapper[4831]: I0309 16:03:08.362871 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cn4z8" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="registry-server" containerID="cri-o://5e7b3dee6c500e08dd9cdff87e2b25f7025df0d338346566011bad68c06e8ce9" gracePeriod=2 Mar 09 16:03:08 crc kubenswrapper[4831]: I0309 16:03:08.363479 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gqxw6" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="registry-server" containerID="cri-o://106b83e181b48b74e554e080a94fb5945d6e88b33a1e018d79139b019c59c264" gracePeriod=2 Mar 09 16:03:08 crc kubenswrapper[4831]: I0309 16:03:08.956805 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.027875 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.160259 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.160972 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.206680 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.376249 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerID="106b83e181b48b74e554e080a94fb5945d6e88b33a1e018d79139b019c59c264" exitCode=0 Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.376347 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqxw6" event={"ID":"b8ed410a-1efe-4e39-853c-f87a9dc04437","Type":"ContainerDied","Data":"106b83e181b48b74e554e080a94fb5945d6e88b33a1e018d79139b019c59c264"} Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.378662 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerID="5e7b3dee6c500e08dd9cdff87e2b25f7025df0d338346566011bad68c06e8ce9" exitCode=0 Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.378773 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn4z8" event={"ID":"3d0e6234-ab07-440a-8926-925d66e3ba7f","Type":"ContainerDied","Data":"5e7b3dee6c500e08dd9cdff87e2b25f7025df0d338346566011bad68c06e8ce9"} Mar 09 16:03:09 crc kubenswrapper[4831]: 
I0309 16:03:09.437962 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.536417 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.539508 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.725627 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-catalog-content\") pod \"3d0e6234-ab07-440a-8926-925d66e3ba7f\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.725702 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c27cp\" (UniqueName: \"kubernetes.io/projected/3d0e6234-ab07-440a-8926-925d66e3ba7f-kube-api-access-c27cp\") pod \"3d0e6234-ab07-440a-8926-925d66e3ba7f\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.725743 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-utilities\") pod \"b8ed410a-1efe-4e39-853c-f87a9dc04437\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.725805 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwmr6\" (UniqueName: \"kubernetes.io/projected/b8ed410a-1efe-4e39-853c-f87a9dc04437-kube-api-access-pwmr6\") pod \"b8ed410a-1efe-4e39-853c-f87a9dc04437\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " Mar 09 16:03:09 
crc kubenswrapper[4831]: I0309 16:03:09.725842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-utilities\") pod \"3d0e6234-ab07-440a-8926-925d66e3ba7f\" (UID: \"3d0e6234-ab07-440a-8926-925d66e3ba7f\") " Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.725860 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-catalog-content\") pod \"b8ed410a-1efe-4e39-853c-f87a9dc04437\" (UID: \"b8ed410a-1efe-4e39-853c-f87a9dc04437\") " Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.728611 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-utilities" (OuterVolumeSpecName: "utilities") pod "b8ed410a-1efe-4e39-853c-f87a9dc04437" (UID: "b8ed410a-1efe-4e39-853c-f87a9dc04437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.729160 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-utilities" (OuterVolumeSpecName: "utilities") pod "3d0e6234-ab07-440a-8926-925d66e3ba7f" (UID: "3d0e6234-ab07-440a-8926-925d66e3ba7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.733836 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d0e6234-ab07-440a-8926-925d66e3ba7f-kube-api-access-c27cp" (OuterVolumeSpecName: "kube-api-access-c27cp") pod "3d0e6234-ab07-440a-8926-925d66e3ba7f" (UID: "3d0e6234-ab07-440a-8926-925d66e3ba7f"). InnerVolumeSpecName "kube-api-access-c27cp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.734536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ed410a-1efe-4e39-853c-f87a9dc04437-kube-api-access-pwmr6" (OuterVolumeSpecName: "kube-api-access-pwmr6") pod "b8ed410a-1efe-4e39-853c-f87a9dc04437" (UID: "b8ed410a-1efe-4e39-853c-f87a9dc04437"). InnerVolumeSpecName "kube-api-access-pwmr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.787576 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d0e6234-ab07-440a-8926-925d66e3ba7f" (UID: "3d0e6234-ab07-440a-8926-925d66e3ba7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.792485 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8ed410a-1efe-4e39-853c-f87a9dc04437" (UID: "b8ed410a-1efe-4e39-853c-f87a9dc04437"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.826997 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.827032 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c27cp\" (UniqueName: \"kubernetes.io/projected/3d0e6234-ab07-440a-8926-925d66e3ba7f-kube-api-access-c27cp\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.827044 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.827052 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwmr6\" (UniqueName: \"kubernetes.io/projected/b8ed410a-1efe-4e39-853c-f87a9dc04437-kube-api-access-pwmr6\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.827062 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0e6234-ab07-440a-8926-925d66e3ba7f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:09 crc kubenswrapper[4831]: I0309 16:03:09.827070 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ed410a-1efe-4e39-853c-f87a9dc04437-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.389683 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cn4z8" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.389671 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn4z8" event={"ID":"3d0e6234-ab07-440a-8926-925d66e3ba7f","Type":"ContainerDied","Data":"db236957ef910c7fe0e077e0db3133cea89b13d91b169f954b26074258b0a3a2"} Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.390156 4831 scope.go:117] "RemoveContainer" containerID="5e7b3dee6c500e08dd9cdff87e2b25f7025df0d338346566011bad68c06e8ce9" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.398039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqxw6" event={"ID":"b8ed410a-1efe-4e39-853c-f87a9dc04437","Type":"ContainerDied","Data":"41fd6411befd516283cacdc8c66b717a9b070cc6e667c877866b0d6e9443380e"} Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.398326 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gqxw6" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.409695 4831 scope.go:117] "RemoveContainer" containerID="af7c6260d816052121f8c27f4c8f8431bc091c7268e5e6d517400d383bbe76db" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.443012 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cn4z8"] Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.447069 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cn4z8"] Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.456802 4831 scope.go:117] "RemoveContainer" containerID="686f1ac64d56b5b906b3f2599e2e4fd33a83525256d31cce6bead4796380af4d" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.457945 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqxw6"] Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.460813 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gqxw6"] Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.476316 4831 scope.go:117] "RemoveContainer" containerID="106b83e181b48b74e554e080a94fb5945d6e88b33a1e018d79139b019c59c264" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.494151 4831 scope.go:117] "RemoveContainer" containerID="3c42e78670ae015904a53fe22b905f570cc44d16297721d3102a9ab581658393" Mar 09 16:03:10 crc kubenswrapper[4831]: I0309 16:03:10.510278 4831 scope.go:117] "RemoveContainer" containerID="8f1234aec641c5970ba89cf2929049c77593ad0a84b1f6dc55895d0dfc28e2a8" Mar 09 16:03:11 crc kubenswrapper[4831]: I0309 16:03:11.632636 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" path="/var/lib/kubelet/pods/3d0e6234-ab07-440a-8926-925d66e3ba7f/volumes" Mar 09 16:03:11 crc kubenswrapper[4831]: I0309 16:03:11.634644 4831 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" path="/var/lib/kubelet/pods/b8ed410a-1efe-4e39-853c-f87a9dc04437/volumes" Mar 09 16:03:12 crc kubenswrapper[4831]: I0309 16:03:12.498542 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw7pt"] Mar 09 16:03:12 crc kubenswrapper[4831]: I0309 16:03:12.499360 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hw7pt" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="registry-server" containerID="cri-o://e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8" gracePeriod=2 Mar 09 16:03:12 crc kubenswrapper[4831]: I0309 16:03:12.934723 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.068074 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-utilities\") pod \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.068824 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c24jz\" (UniqueName: \"kubernetes.io/projected/2325f3d2-538f-4529-ac16-4c7c81cd13e3-kube-api-access-c24jz\") pod \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.068988 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-catalog-content\") pod \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\" (UID: \"2325f3d2-538f-4529-ac16-4c7c81cd13e3\") " Mar 09 16:03:13 
crc kubenswrapper[4831]: I0309 16:03:13.069005 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-utilities" (OuterVolumeSpecName: "utilities") pod "2325f3d2-538f-4529-ac16-4c7c81cd13e3" (UID: "2325f3d2-538f-4529-ac16-4c7c81cd13e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.069854 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.075689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325f3d2-538f-4529-ac16-4c7c81cd13e3-kube-api-access-c24jz" (OuterVolumeSpecName: "kube-api-access-c24jz") pod "2325f3d2-538f-4529-ac16-4c7c81cd13e3" (UID: "2325f3d2-538f-4529-ac16-4c7c81cd13e3"). InnerVolumeSpecName "kube-api-access-c24jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.171231 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c24jz\" (UniqueName: \"kubernetes.io/projected/2325f3d2-538f-4529-ac16-4c7c81cd13e3-kube-api-access-c24jz\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.203032 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2325f3d2-538f-4529-ac16-4c7c81cd13e3" (UID: "2325f3d2-538f-4529-ac16-4c7c81cd13e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.272883 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2325f3d2-538f-4529-ac16-4c7c81cd13e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.419450 4831 generic.go:334] "Generic (PLEG): container finished" podID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerID="e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8" exitCode=0 Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.419502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw7pt" event={"ID":"2325f3d2-538f-4529-ac16-4c7c81cd13e3","Type":"ContainerDied","Data":"e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8"} Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.419532 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw7pt" event={"ID":"2325f3d2-538f-4529-ac16-4c7c81cd13e3","Type":"ContainerDied","Data":"20736501e625e988fe4ea51716d10fa98104952db2a6ad815c45246c7a7e9d0a"} Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.419534 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hw7pt" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.419555 4831 scope.go:117] "RemoveContainer" containerID="e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.437259 4831 scope.go:117] "RemoveContainer" containerID="041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.463149 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw7pt"] Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.466570 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hw7pt"] Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.483447 4831 scope.go:117] "RemoveContainer" containerID="a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.503050 4831 scope.go:117] "RemoveContainer" containerID="e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8" Mar 09 16:03:13 crc kubenswrapper[4831]: E0309 16:03:13.503790 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8\": container with ID starting with e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8 not found: ID does not exist" containerID="e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.503844 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8"} err="failed to get container status \"e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8\": rpc error: code = NotFound desc = could not find container 
\"e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8\": container with ID starting with e3f4c55b55145739e21954120d4bc97a11606f4f976185e955b2d568ab699ac8 not found: ID does not exist" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.503874 4831 scope.go:117] "RemoveContainer" containerID="041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258" Mar 09 16:03:13 crc kubenswrapper[4831]: E0309 16:03:13.504365 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258\": container with ID starting with 041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258 not found: ID does not exist" containerID="041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.504453 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258"} err="failed to get container status \"041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258\": rpc error: code = NotFound desc = could not find container \"041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258\": container with ID starting with 041bb6f5b68c6118c601e011fed76dc028128442f9799d87a43e6cd26a23a258 not found: ID does not exist" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.504473 4831 scope.go:117] "RemoveContainer" containerID="a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc" Mar 09 16:03:13 crc kubenswrapper[4831]: E0309 16:03:13.504794 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc\": container with ID starting with a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc not found: ID does not exist" 
containerID="a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.504833 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc"} err="failed to get container status \"a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc\": rpc error: code = NotFound desc = could not find container \"a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc\": container with ID starting with a30311238f265767fd837a761fe82a90afed738b9d90866fbe3842d191098cdc not found: ID does not exist" Mar 09 16:03:13 crc kubenswrapper[4831]: I0309 16:03:13.627462 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" path="/var/lib/kubelet/pods/2325f3d2-538f-4529-ac16-4c7c81cd13e3/volumes" Mar 09 16:03:16 crc kubenswrapper[4831]: I0309 16:03:16.260000 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2nnxk"] Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.236530 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6946984dd4-rl27n"] Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.237236 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" podUID="a5fd7ef9-49b7-420a-832e-27dff0ed37f1" containerName="controller-manager" containerID="cri-o://796b22d8e7bd768506831e107dab9bfc13cc73ff17f5b5d54dc84b9f650d37e2" gracePeriod=30 Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.337818 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7"] Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.338174 4831 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" podUID="7affd259-b338-4ea3-b135-ed314370262c" containerName="route-controller-manager" containerID="cri-o://a6b400282aa36d561ae5f4615cea864d517d5207356c70af5b05a355ee046452" gracePeriod=30 Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.472824 4831 generic.go:334] "Generic (PLEG): container finished" podID="a5fd7ef9-49b7-420a-832e-27dff0ed37f1" containerID="796b22d8e7bd768506831e107dab9bfc13cc73ff17f5b5d54dc84b9f650d37e2" exitCode=0 Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.473032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" event={"ID":"a5fd7ef9-49b7-420a-832e-27dff0ed37f1","Type":"ContainerDied","Data":"796b22d8e7bd768506831e107dab9bfc13cc73ff17f5b5d54dc84b9f650d37e2"} Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.476517 4831 generic.go:334] "Generic (PLEG): container finished" podID="7affd259-b338-4ea3-b135-ed314370262c" containerID="a6b400282aa36d561ae5f4615cea864d517d5207356c70af5b05a355ee046452" exitCode=0 Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.476568 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" event={"ID":"7affd259-b338-4ea3-b135-ed314370262c","Type":"ContainerDied","Data":"a6b400282aa36d561ae5f4615cea864d517d5207356c70af5b05a355ee046452"} Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.814271 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.818616 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.918184 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-config\") pod \"7affd259-b338-4ea3-b135-ed314370262c\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.918243 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnpq8\" (UniqueName: \"kubernetes.io/projected/7affd259-b338-4ea3-b135-ed314370262c-kube-api-access-qnpq8\") pod \"7affd259-b338-4ea3-b135-ed314370262c\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.918300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7affd259-b338-4ea3-b135-ed314370262c-serving-cert\") pod \"7affd259-b338-4ea3-b135-ed314370262c\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.918379 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-client-ca\") pod \"7affd259-b338-4ea3-b135-ed314370262c\" (UID: \"7affd259-b338-4ea3-b135-ed314370262c\") " Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.919155 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-client-ca" (OuterVolumeSpecName: "client-ca") pod "7affd259-b338-4ea3-b135-ed314370262c" (UID: "7affd259-b338-4ea3-b135-ed314370262c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.919189 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-config" (OuterVolumeSpecName: "config") pod "7affd259-b338-4ea3-b135-ed314370262c" (UID: "7affd259-b338-4ea3-b135-ed314370262c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.923541 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7affd259-b338-4ea3-b135-ed314370262c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7affd259-b338-4ea3-b135-ed314370262c" (UID: "7affd259-b338-4ea3-b135-ed314370262c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:22 crc kubenswrapper[4831]: I0309 16:03:22.923754 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7affd259-b338-4ea3-b135-ed314370262c-kube-api-access-qnpq8" (OuterVolumeSpecName: "kube-api-access-qnpq8") pod "7affd259-b338-4ea3-b135-ed314370262c" (UID: "7affd259-b338-4ea3-b135-ed314370262c"). InnerVolumeSpecName "kube-api-access-qnpq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wxtg\" (UniqueName: \"kubernetes.io/projected/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-kube-api-access-6wxtg\") pod \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019620 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-proxy-ca-bundles\") pod \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019698 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-config\") pod \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019730 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-client-ca\") pod \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-serving-cert\") pod \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\" (UID: \"a5fd7ef9-49b7-420a-832e-27dff0ed37f1\") " Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019936 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019952 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnpq8\" (UniqueName: \"kubernetes.io/projected/7affd259-b338-4ea3-b135-ed314370262c-kube-api-access-qnpq8\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019965 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7affd259-b338-4ea3-b135-ed314370262c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.019979 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7affd259-b338-4ea3-b135-ed314370262c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.020742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "a5fd7ef9-49b7-420a-832e-27dff0ed37f1" (UID: "a5fd7ef9-49b7-420a-832e-27dff0ed37f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.020784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-config" (OuterVolumeSpecName: "config") pod "a5fd7ef9-49b7-420a-832e-27dff0ed37f1" (UID: "a5fd7ef9-49b7-420a-832e-27dff0ed37f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.020765 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a5fd7ef9-49b7-420a-832e-27dff0ed37f1" (UID: "a5fd7ef9-49b7-420a-832e-27dff0ed37f1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.022590 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a5fd7ef9-49b7-420a-832e-27dff0ed37f1" (UID: "a5fd7ef9-49b7-420a-832e-27dff0ed37f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.023472 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-kube-api-access-6wxtg" (OuterVolumeSpecName: "kube-api-access-6wxtg") pod "a5fd7ef9-49b7-420a-832e-27dff0ed37f1" (UID: "a5fd7ef9-49b7-420a-832e-27dff0ed37f1"). InnerVolumeSpecName "kube-api-access-6wxtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.121856 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wxtg\" (UniqueName: \"kubernetes.io/projected/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-kube-api-access-6wxtg\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.121905 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.121918 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.121933 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.121942 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5fd7ef9-49b7-420a-832e-27dff0ed37f1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.485631 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" event={"ID":"a5fd7ef9-49b7-420a-832e-27dff0ed37f1","Type":"ContainerDied","Data":"0f4dc967824ca002841327240f96cf8da36e1335ce909200a8717d938f9411ef"} Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.485690 4831 scope.go:117] "RemoveContainer" containerID="796b22d8e7bd768506831e107dab9bfc13cc73ff17f5b5d54dc84b9f650d37e2" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.485794 4831 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6946984dd4-rl27n" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.490879 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" event={"ID":"7affd259-b338-4ea3-b135-ed314370262c","Type":"ContainerDied","Data":"368b570c9e7b3e297edcb530865d4c7550a6cf82f84259f96e842a1e995057a6"} Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.490977 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.521354 4831 scope.go:117] "RemoveContainer" containerID="a6b400282aa36d561ae5f4615cea864d517d5207356c70af5b05a355ee046452" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.533188 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6946984dd4-rl27n"] Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.538267 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6946984dd4-rl27n"] Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.549768 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7"] Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.552285 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dff78cf7b-dsqd7"] Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.628338 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7affd259-b338-4ea3-b135-ed314370262c" path="/var/lib/kubelet/pods/7affd259-b338-4ea3-b135-ed314370262c/volumes" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.629143 4831 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a5fd7ef9-49b7-420a-832e-27dff0ed37f1" path="/var/lib/kubelet/pods/a5fd7ef9-49b7-420a-832e-27dff0ed37f1/volumes" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.945913 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8"] Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946339 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="extract-utilities" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946367 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="extract-utilities" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946392 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946446 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946486 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946506 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946529 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7affd259-b338-4ea3-b135-ed314370262c" containerName="route-controller-manager" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946545 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7affd259-b338-4ea3-b135-ed314370262c" containerName="route-controller-manager" Mar 09 16:03:23 crc 
kubenswrapper[4831]: E0309 16:03:23.946571 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946590 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946619 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="extract-content" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946637 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="extract-content" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946666 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="extract-utilities" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946683 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="extract-utilities" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946706 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="extract-content" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946720 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="extract-content" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946734 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="extract-utilities" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946748 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="extract-utilities" Mar 09 16:03:23 crc 
kubenswrapper[4831]: E0309 16:03:23.946774 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fd7ef9-49b7-420a-832e-27dff0ed37f1" containerName="controller-manager" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946791 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fd7ef9-49b7-420a-832e-27dff0ed37f1" containerName="controller-manager" Mar 09 16:03:23 crc kubenswrapper[4831]: E0309 16:03:23.946819 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="extract-content" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.946875 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="extract-content" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.947069 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7affd259-b338-4ea3-b135-ed314370262c" containerName="route-controller-manager" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.947088 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0e6234-ab07-440a-8926-925d66e3ba7f" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.947112 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2325f3d2-538f-4529-ac16-4c7c81cd13e3" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.947138 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fd7ef9-49b7-420a-832e-27dff0ed37f1" containerName="controller-manager" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.947155 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ed410a-1efe-4e39-853c-f87a9dc04437" containerName="registry-server" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.948003 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.953534 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f9fdf949f-bkdcc"] Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.955283 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.956566 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.956754 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.956995 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.957162 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.959104 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.959634 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.959945 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.960016 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 
16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.960204 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.960312 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.960384 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.960473 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.964680 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8"] Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.968183 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f9fdf949f-bkdcc"] Mar 09 16:03:23 crc kubenswrapper[4831]: I0309 16:03:23.975324 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.136096 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-client-ca\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.136186 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc8k\" (UniqueName: 
\"kubernetes.io/projected/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-kube-api-access-4kc8k\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.136310 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-config\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.136366 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-serving-cert\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.136760 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-proxy-ca-bundles\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.136860 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gjsb\" (UniqueName: \"kubernetes.io/projected/014d145b-93b3-44ed-a338-4e787e53927f-kube-api-access-9gjsb\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " 
pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.137320 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/014d145b-93b3-44ed-a338-4e787e53927f-client-ca\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.137443 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/014d145b-93b3-44ed-a338-4e787e53927f-serving-cert\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.138203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/014d145b-93b3-44ed-a338-4e787e53927f-config\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.239586 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-client-ca\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.239661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-config\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.239683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-serving-cert\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.239704 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc8k\" (UniqueName: \"kubernetes.io/projected/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-kube-api-access-4kc8k\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.239731 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-proxy-ca-bundles\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.239779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gjsb\" (UniqueName: \"kubernetes.io/projected/014d145b-93b3-44ed-a338-4e787e53927f-kube-api-access-9gjsb\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: 
I0309 16:03:24.239829 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/014d145b-93b3-44ed-a338-4e787e53927f-client-ca\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.239882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/014d145b-93b3-44ed-a338-4e787e53927f-serving-cert\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.242071 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/014d145b-93b3-44ed-a338-4e787e53927f-client-ca\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.242460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/014d145b-93b3-44ed-a338-4e787e53927f-config\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.242849 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-client-ca\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " 
pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.243126 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/014d145b-93b3-44ed-a338-4e787e53927f-config\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.243901 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-proxy-ca-bundles\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.248303 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-config\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.250492 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/014d145b-93b3-44ed-a338-4e787e53927f-serving-cert\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.250574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-serving-cert\") pod 
\"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.261945 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gjsb\" (UniqueName: \"kubernetes.io/projected/014d145b-93b3-44ed-a338-4e787e53927f-kube-api-access-9gjsb\") pod \"route-controller-manager-c9fbbdd7d-xlqk8\" (UID: \"014d145b-93b3-44ed-a338-4e787e53927f\") " pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.263975 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc8k\" (UniqueName: \"kubernetes.io/projected/8a971c10-fe7b-4be6-b847-ccaa4b4b5314-kube-api-access-4kc8k\") pod \"controller-manager-f9fdf949f-bkdcc\" (UID: \"8a971c10-fe7b-4be6-b847-ccaa4b4b5314\") " pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.302542 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.309439 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.596241 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f9fdf949f-bkdcc"] Mar 09 16:03:24 crc kubenswrapper[4831]: W0309 16:03:24.606612 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a971c10_fe7b_4be6_b847_ccaa4b4b5314.slice/crio-6eb29117956e8bd1e5406df2039053c139f8a72c7e3766158c341a6f1731800e WatchSource:0}: Error finding container 6eb29117956e8bd1e5406df2039053c139f8a72c7e3766158c341a6f1731800e: Status 404 returned error can't find the container with id 6eb29117956e8bd1e5406df2039053c139f8a72c7e3766158c341a6f1731800e Mar 09 16:03:24 crc kubenswrapper[4831]: I0309 16:03:24.732312 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8"] Mar 09 16:03:24 crc kubenswrapper[4831]: W0309 16:03:24.735605 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014d145b_93b3_44ed_a338_4e787e53927f.slice/crio-645061f13e049afde6bd9873a607417892453b5f07e86587cbfbd9d68e05ce0d WatchSource:0}: Error finding container 645061f13e049afde6bd9873a607417892453b5f07e86587cbfbd9d68e05ce0d: Status 404 returned error can't find the container with id 645061f13e049afde6bd9873a607417892453b5f07e86587cbfbd9d68e05ce0d Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.508649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" event={"ID":"014d145b-93b3-44ed-a338-4e787e53927f","Type":"ContainerStarted","Data":"f65f0e2fa7c1206e9fac78e9c818e93bb91dbf6063885fd10c8d58df0ffefe62"} Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.508735 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" event={"ID":"014d145b-93b3-44ed-a338-4e787e53927f","Type":"ContainerStarted","Data":"645061f13e049afde6bd9873a607417892453b5f07e86587cbfbd9d68e05ce0d"} Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.509921 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.511802 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" event={"ID":"8a971c10-fe7b-4be6-b847-ccaa4b4b5314","Type":"ContainerStarted","Data":"906f66684958dfc748be1f09037aa43c16be7cb456b168fc634e98e08b6a788a"} Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.511839 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" event={"ID":"8a971c10-fe7b-4be6-b847-ccaa4b4b5314","Type":"ContainerStarted","Data":"6eb29117956e8bd1e5406df2039053c139f8a72c7e3766158c341a6f1731800e"} Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.512336 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.515573 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.516889 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" Mar 09 16:03:25 crc kubenswrapper[4831]: I0309 16:03:25.532427 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-c9fbbdd7d-xlqk8" podStartSLOduration=3.53240784 podStartE2EDuration="3.53240784s" podCreationTimestamp="2026-03-09 16:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:03:25.531268468 +0000 UTC m=+332.664950891" watchObservedRunningTime="2026-03-09 16:03:25.53240784 +0000 UTC m=+332.666090253" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.187224 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.187882 4831 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.188107 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05" gracePeriod=15 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.188152 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.188160 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee" gracePeriod=15 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.188208 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d" gracePeriod=15 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.188243 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667" gracePeriod=15 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.188274 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1" gracePeriod=15 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.190672 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.190905 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 16:03:26 
crc kubenswrapper[4831]: I0309 16:03:26.190935 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.190950 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.190962 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.190983 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.190995 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191007 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191018 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191033 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191043 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191059 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191070 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191083 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191093 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191106 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191116 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191127 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191137 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191152 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191163 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 
16:03:26.191300 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191316 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191331 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191344 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191356 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191367 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191379 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191414 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.191622 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191639 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191791 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.191810 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.266480 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.269842 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.269894 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.270109 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.270182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.270222 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.270251 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.270274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378322 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378388 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378458 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378486 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378507 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378526 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378523 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378561 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378616 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378605 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378640 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378595 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378679 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378731 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.378780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.520875 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.522544 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.524816 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee" exitCode=0 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.524851 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d" exitCode=0 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.524860 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667" exitCode=0 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.524871 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1" exitCode=2 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.524941 4831 scope.go:117] "RemoveContainer" containerID="6d3d7955a73aa5f30d2a1eecd1b8036781a590d8a50b378076cab69babf7723d" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.528219 4831 generic.go:334] "Generic (PLEG): container finished" podID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" 
containerID="a2a99e375e224dd2c4b1bccf69cf2d00472d6e1564d79f2dc720b4d77ed92f02" exitCode=0 Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.528676 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb770b7c-9f0b-4e5e-a912-f34519e56e13","Type":"ContainerDied","Data":"a2a99e375e224dd2c4b1bccf69cf2d00472d6e1564d79f2dc720b4d77ed92f02"} Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.530224 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.531124 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.637131 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.638363 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.639142 4831 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.639437 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.639695 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.639726 4831 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.639993 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="200ms" Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.670214 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 09 16:03:26 crc kubenswrapper[4831]: I0309 16:03:26.670272 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 
192.168.126.11:17697: connect: connection refused" Mar 09 16:03:26 crc kubenswrapper[4831]: E0309 16:03:26.841097 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="400ms" Mar 09 16:03:27 crc kubenswrapper[4831]: E0309 16:03:27.242829 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="800ms" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.539785 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.855342 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.856036 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.903051 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-var-lock\") pod \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.903195 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kube-api-access\") pod \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.903198 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-var-lock" (OuterVolumeSpecName: "var-lock") pod "cb770b7c-9f0b-4e5e-a912-f34519e56e13" (UID: "cb770b7c-9f0b-4e5e-a912-f34519e56e13"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.903238 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kubelet-dir\") pod \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\" (UID: \"cb770b7c-9f0b-4e5e-a912-f34519e56e13\") " Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.903416 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cb770b7c-9f0b-4e5e-a912-f34519e56e13" (UID: "cb770b7c-9f0b-4e5e-a912-f34519e56e13"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.903760 4831 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.903785 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:27 crc kubenswrapper[4831]: I0309 16:03:27.910630 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cb770b7c-9f0b-4e5e-a912-f34519e56e13" (UID: "cb770b7c-9f0b-4e5e-a912-f34519e56e13"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.005046 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb770b7c-9f0b-4e5e-a912-f34519e56e13-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:28 crc kubenswrapper[4831]: E0309 16:03:28.044050 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="1.6s" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.547872 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb770b7c-9f0b-4e5e-a912-f34519e56e13","Type":"ContainerDied","Data":"e67a77d8df618ea70c7921b4949ea474e4424c993235c21b10ba132ae6aa34dd"} Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.547919 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e67a77d8df618ea70c7921b4949ea474e4424c993235c21b10ba132ae6aa34dd" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.548015 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.627434 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.656194 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.658729 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.659153 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.659313 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.822500 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 
16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.822577 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.822609 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.822659 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.822796 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.822847 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.823133 4831 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.823215 4831 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:28 crc kubenswrapper[4831]: I0309 16:03:28.823256 4831 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.557070 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.557873 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05" exitCode=0 Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.557935 4831 scope.go:117] "RemoveContainer" containerID="75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.557965 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.572814 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.573265 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.578003 4831 scope.go:117] "RemoveContainer" containerID="319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.599526 4831 scope.go:117] "RemoveContainer" containerID="f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.621029 4831 scope.go:117] "RemoveContainer" containerID="73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.625703 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.637044 4831 scope.go:117] "RemoveContainer" containerID="f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05" Mar 09 16:03:29 crc kubenswrapper[4831]: E0309 16:03:29.645621 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="3.2s" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.661647 4831 scope.go:117] "RemoveContainer" containerID="f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.694477 4831 scope.go:117] "RemoveContainer" containerID="75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee" Mar 09 16:03:29 crc kubenswrapper[4831]: E0309 16:03:29.695141 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee\": container with ID starting with 75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee not found: ID does not exist" containerID="75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.695207 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee"} err="failed to get container status \"75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee\": rpc error: code = NotFound desc = could not find container \"75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee\": container with ID starting with 75f0e85199369e9992d08642810f5ae99a3580a8452fab0bc050ab84a209f2ee not found: ID does not exist" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.695242 4831 scope.go:117] "RemoveContainer" containerID="319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d" Mar 09 16:03:29 crc kubenswrapper[4831]: E0309 16:03:29.695875 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\": container with ID starting with 319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d not found: ID does not exist" containerID="319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.695936 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d"} err="failed to get container status \"319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\": rpc error: code = NotFound desc = could not find container \"319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d\": container with ID starting with 319ad157b61aeb4162436252542f11bb3d0ec8f917a4d979e54c17d1a29cb72d not found: ID does not exist" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.695979 4831 scope.go:117] "RemoveContainer" containerID="f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667" Mar 09 16:03:29 crc kubenswrapper[4831]: E0309 16:03:29.697132 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\": container with ID starting with f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667 not found: ID does not exist" containerID="f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.697214 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667"} err="failed to get container status \"f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\": rpc error: code = NotFound desc = could not find container \"f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667\": container with ID 
starting with f08425d6b91ac903aa588fbd8c7d6f49ba45aac9c897376430547dee5841d667 not found: ID does not exist" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.697237 4831 scope.go:117] "RemoveContainer" containerID="73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1" Mar 09 16:03:29 crc kubenswrapper[4831]: E0309 16:03:29.697717 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\": container with ID starting with 73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1 not found: ID does not exist" containerID="73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.697747 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1"} err="failed to get container status \"73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\": rpc error: code = NotFound desc = could not find container \"73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1\": container with ID starting with 73f45c10fa24d4f84fe5f9062e050ec589bbabf28cc24a5ba537fedfd890a5a1 not found: ID does not exist" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.697769 4831 scope.go:117] "RemoveContainer" containerID="f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05" Mar 09 16:03:29 crc kubenswrapper[4831]: E0309 16:03:29.698015 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\": container with ID starting with f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05 not found: ID does not exist" containerID="f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05" Mar 09 
16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.698042 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05"} err="failed to get container status \"f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\": rpc error: code = NotFound desc = could not find container \"f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05\": container with ID starting with f6508428e0d6bbac63a17b9fbd84a6418fbc5cef1874b76f9f5f49f6e9677b05 not found: ID does not exist" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.698059 4831 scope.go:117] "RemoveContainer" containerID="f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5" Mar 09 16:03:29 crc kubenswrapper[4831]: E0309 16:03:29.698327 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\": container with ID starting with f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5 not found: ID does not exist" containerID="f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5" Mar 09 16:03:29 crc kubenswrapper[4831]: I0309 16:03:29.698361 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5"} err="failed to get container status \"f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\": rpc error: code = NotFound desc = could not find container \"f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5\": container with ID starting with f64c9836f6073e8dcd0a5268b8ae94ba8b2943d13360cff4d8a1d9f89c8dbbc5 not found: ID does not exist" Mar 09 16:03:31 crc kubenswrapper[4831]: E0309 16:03:31.225710 4831 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:31 crc kubenswrapper[4831]: I0309 16:03:31.226389 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:31 crc kubenswrapper[4831]: E0309 16:03:31.256828 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b37c83106e9a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 16:03:31.255945639 +0000 UTC m=+338.389628062,LastTimestamp:2026-03-09 16:03:31.255945639 +0000 UTC m=+338.389628062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 16:03:31 crc kubenswrapper[4831]: I0309 16:03:31.578812 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fc815dd1279d31cb6205f9013527dec6eef5af351fb9d33ab759b0591d0f0fb0"} Mar 09 16:03:31 crc kubenswrapper[4831]: 
I0309 16:03:31.578881 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"22c1c0abcef38f131aebe6d3ddeae5e8bcae4779e289217f0333cbc826bd0c7c"} Mar 09 16:03:31 crc kubenswrapper[4831]: I0309 16:03:31.579725 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:31 crc kubenswrapper[4831]: E0309 16:03:31.579751 4831 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:03:32 crc kubenswrapper[4831]: E0309 16:03:32.847347 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="6.4s" Mar 09 16:03:33 crc kubenswrapper[4831]: I0309 16:03:33.620425 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:38 crc kubenswrapper[4831]: E0309 16:03:38.365959 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.162:6443: connect: 
connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b37c83106e9a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 16:03:31.255945639 +0000 UTC m=+338.389628062,LastTimestamp:2026-03-09 16:03:31.255945639 +0000 UTC m=+338.389628062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 16:03:38 crc kubenswrapper[4831]: I0309 16:03:38.616931 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:38 crc kubenswrapper[4831]: I0309 16:03:38.618316 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:38 crc kubenswrapper[4831]: I0309 16:03:38.633550 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:38 crc kubenswrapper[4831]: I0309 16:03:38.633595 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:38 crc kubenswrapper[4831]: E0309 16:03:38.634145 4831 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:38 crc kubenswrapper[4831]: I0309 16:03:38.634906 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:39 crc kubenswrapper[4831]: E0309 16:03:39.249309 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="7s" Mar 09 16:03:39 crc kubenswrapper[4831]: I0309 16:03:39.638028 4831 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1c46cbecce3c72a89ff48151f3d9365fdbd0dd58661e89484f95a2803060f245" exitCode=0 Mar 09 16:03:39 crc kubenswrapper[4831]: I0309 16:03:39.638107 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1c46cbecce3c72a89ff48151f3d9365fdbd0dd58661e89484f95a2803060f245"} Mar 09 16:03:39 crc kubenswrapper[4831]: I0309 16:03:39.638150 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d9d422d2d7d7837cb9d257e451c7fd4e6ace96588ee25ce8d59fc4e106c4a1d"} Mar 09 16:03:39 crc kubenswrapper[4831]: I0309 16:03:39.638479 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:39 crc kubenswrapper[4831]: I0309 16:03:39.638495 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:39 crc kubenswrapper[4831]: E0309 16:03:39.639127 4831 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:39 crc kubenswrapper[4831]: I0309 16:03:39.639296 4831 status_manager.go:851] "Failed to get status for pod" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" Mar 09 16:03:40 crc kubenswrapper[4831]: I0309 16:03:40.646077 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"56571d91d6ef8a50448b614b2a91a3a96eb8e926212433f3340ec2324ae0425a"} Mar 09 16:03:40 crc kubenswrapper[4831]: I0309 16:03:40.646427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8967b7feedabd2d47474abd7886eca5d9f4e463be3adfd833260c203e85d64a1"} Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.298209 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" podUID="62d1a4be-a162-466f-b579-247a86379faa" containerName="oauth-openshift" containerID="cri-o://0c4ff45242da52b6df31ac2222e1c7d28ea1abcf97edf96d37c8284648f611cf" gracePeriod=15 Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.657173 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d429940d83771aee266b6fbc9dde3340fa63fd60b0920696473a89744beec2a8"} Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.657670 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aef875988ae29e2f3c3ca77c9089bd7c0db114020034bc780f226e7af4e98d24"} Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.660508 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.661727 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.661779 4831 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f" exitCode=1 Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.661844 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f"} Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.662428 4831 scope.go:117] "RemoveContainer" containerID="8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.666230 4831 generic.go:334] "Generic (PLEG): container finished" podID="62d1a4be-a162-466f-b579-247a86379faa" containerID="0c4ff45242da52b6df31ac2222e1c7d28ea1abcf97edf96d37c8284648f611cf" exitCode=0 Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.666283 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" 
event={"ID":"62d1a4be-a162-466f-b579-247a86379faa","Type":"ContainerDied","Data":"0c4ff45242da52b6df31ac2222e1c7d28ea1abcf97edf96d37c8284648f611cf"} Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.824444 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-trusted-ca-bundle\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910642 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-idp-0-file-data\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910682 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-provider-selection\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910720 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-serving-cert\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 
16:03:41.910746 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-session\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910784 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62d1a4be-a162-466f-b579-247a86379faa-audit-dir\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910828 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-router-certs\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910890 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr8bw\" (UniqueName: \"kubernetes.io/projected/62d1a4be-a162-466f-b579-247a86379faa-kube-api-access-pr8bw\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910940 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-service-ca\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.910984 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-cliconfig\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.911016 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-ocp-branding-template\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.911044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-error\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.911072 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-audit-policies\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.911098 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-login\") pod \"62d1a4be-a162-466f-b579-247a86379faa\" (UID: \"62d1a4be-a162-466f-b579-247a86379faa\") " Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.911826 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: 
"v4-0-config-system-trusted-ca-bundle") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.912025 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62d1a4be-a162-466f-b579-247a86379faa-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.912212 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.918902 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.919591 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.919913 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.920317 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.923898 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.924173 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d1a4be-a162-466f-b579-247a86379faa-kube-api-access-pr8bw" (OuterVolumeSpecName: "kube-api-access-pr8bw") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). 
InnerVolumeSpecName "kube-api-access-pr8bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.927574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.928877 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.929362 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.931153 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:41 crc kubenswrapper[4831]: I0309 16:03:41.932367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "62d1a4be-a162-466f-b579-247a86379faa" (UID: "62d1a4be-a162-466f-b579-247a86379faa"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.012623 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013063 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013213 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013300 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013426 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013561 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013654 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013757 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013850 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.013993 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.014120 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 
16:03:42.014229 4831 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62d1a4be-a162-466f-b579-247a86379faa-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.014337 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62d1a4be-a162-466f-b579-247a86379faa-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.014480 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr8bw\" (UniqueName: \"kubernetes.io/projected/62d1a4be-a162-466f-b579-247a86379faa-kube-api-access-pr8bw\") on node \"crc\" DevicePath \"\"" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.679288 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93505d06b6a6f009716954d82557f59d820c33d26cb24621432d6157373a00cf"} Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.679628 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.679788 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.679822 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.682789 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 16:03:42 crc kubenswrapper[4831]: 
I0309 16:03:42.684243 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.684376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ccc71d149c402ba43d55d894f4d3f6dfd20fce7cb49269b0f41c6220fc6f5afb"} Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.687739 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" event={"ID":"62d1a4be-a162-466f-b579-247a86379faa","Type":"ContainerDied","Data":"e0ab32144cb9d58a423422329e692e814c5dca299b4c7a1fc34065289c41d75f"} Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.687862 4831 scope.go:117] "RemoveContainer" containerID="0c4ff45242da52b6df31ac2222e1c7d28ea1abcf97edf96d37c8284648f611cf" Mar 09 16:03:42 crc kubenswrapper[4831]: I0309 16:03:42.688062 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2nnxk" Mar 09 16:03:43 crc kubenswrapper[4831]: I0309 16:03:43.246114 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 16:03:43 crc kubenswrapper[4831]: I0309 16:03:43.246734 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 09 16:03:43 crc kubenswrapper[4831]: I0309 16:03:43.246857 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 09 16:03:43 crc kubenswrapper[4831]: I0309 16:03:43.635825 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:43 crc kubenswrapper[4831]: I0309 16:03:43.635876 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:43 crc kubenswrapper[4831]: I0309 16:03:43.642905 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:47 crc kubenswrapper[4831]: I0309 16:03:47.690926 4831 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:47 crc kubenswrapper[4831]: I0309 16:03:47.718968 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 16:03:47 crc kubenswrapper[4831]: I0309 16:03:47.722558 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:47 crc kubenswrapper[4831]: I0309 16:03:47.722593 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:47 crc kubenswrapper[4831]: I0309 16:03:47.727341 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 16:03:47 crc kubenswrapper[4831]: I0309 16:03:47.733443 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6ae28171-df3f-454e-888c-8abe585b80b3" Mar 09 16:03:48 crc kubenswrapper[4831]: I0309 16:03:48.729313 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:48 crc kubenswrapper[4831]: I0309 16:03:48.729341 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c" Mar 09 16:03:53 crc kubenswrapper[4831]: I0309 16:03:53.246874 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 09 16:03:53 crc kubenswrapper[4831]: I0309 16:03:53.247518 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 09 16:03:53 crc kubenswrapper[4831]: I0309 16:03:53.655425 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6ae28171-df3f-454e-888c-8abe585b80b3" Mar 09 16:03:56 crc kubenswrapper[4831]: I0309 16:03:56.537303 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 16:03:57 crc kubenswrapper[4831]: I0309 16:03:57.768027 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 16:03:58 crc kubenswrapper[4831]: I0309 16:03:58.359886 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 16:03:58 crc kubenswrapper[4831]: I0309 16:03:58.506697 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 16:03:58 crc kubenswrapper[4831]: I0309 16:03:58.761630 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 16:03:58 crc kubenswrapper[4831]: I0309 16:03:58.844832 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 16:03:58 crc kubenswrapper[4831]: I0309 16:03:58.997875 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 16:03:59 crc kubenswrapper[4831]: I0309 16:03:59.217364 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" 
Mar 09 16:03:59 crc kubenswrapper[4831]: I0309 16:03:59.716424 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 16:03:59 crc kubenswrapper[4831]: I0309 16:03:59.743073 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 16:03:59 crc kubenswrapper[4831]: I0309 16:03:59.993822 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.124994 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.213652 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.374335 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.401488 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.407994 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.504536 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.625605 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.658688 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.667818 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.716033 4831 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.721075 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.775752 4831 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.783564 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.789566 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.809334 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.841418 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.868259 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.878617 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.936049 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 16:04:00 crc kubenswrapper[4831]: I0309 16:04:00.985765 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.065345 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.124161 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.134426 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.215858 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.269899 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.275045 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.332939 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.501892 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.718361 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.776294 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.805642 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.824713 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.899429 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.902503 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 16:04:01 crc kubenswrapper[4831]: I0309 16:04:01.958966 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.021285 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.029895 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.064134 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.086529 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.251110 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.307458 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.312654 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.365267 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.392988 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.510722 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.542895 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.679783 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.716392 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.775277 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.825540 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.875079 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 16:04:02 crc kubenswrapper[4831]: I0309 16:04:02.906344 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.083745 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.182702 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.246710 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.246801 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.246877 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.247785 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ccc71d149c402ba43d55d894f4d3f6dfd20fce7cb49269b0f41c6220fc6f5afb"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.247984 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://ccc71d149c402ba43d55d894f4d3f6dfd20fce7cb49269b0f41c6220fc6f5afb" gracePeriod=30
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.366816 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.407654 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.566059 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.576549 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.583773 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.691264 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.705132 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.784950 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.922652 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.958068 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.995996 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 09 16:04:03 crc kubenswrapper[4831]: I0309 16:04:03.996169 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.034777 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.059597 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.060994 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.123454 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.176298 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.195686 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.211766 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.267190 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.278743 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.351304 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.352894 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.400218 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.433190 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.439227 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.513581 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.537219 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.669986 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.726866 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.748865 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.765725 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.804637 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.823657 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.863998 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.883176 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.921498 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.929440 4831 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 09 16:04:04 crc kubenswrapper[4831]: I0309 16:04:04.973536 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.062943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.086642 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.142622 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.175121 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.263060 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.265731 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.316262 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.488100 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.488834 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.503334 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.513271 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.633757 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.634024 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.644665 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.675965 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.773748 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.836985 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.837293 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.838745 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.878081 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.950734 4831 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.953339 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.959453 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 16:04:05 crc kubenswrapper[4831]: I0309 16:04:05.966824 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.110386 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.149815 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.165505 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.185579 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.253384 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.275448 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.303175 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.419044 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.453363 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.454318 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.474910 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.488500 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.496659 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.550042 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.666885 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.672258 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.697107 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.849770 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 16:04:06 crc kubenswrapper[4831]: I0309 16:04:06.988706 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.109875 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.309612 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.358898 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.396257 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.442572 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.461591 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.582715 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.644928 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.655970 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.668872 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.758490 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.788552 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.836879 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.843637 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.859644 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.907878 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.908528 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.919882 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.949650 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.963149 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 16:04:07 crc kubenswrapper[4831]: I0309 16:04:07.964264 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.016044 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.167365 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.241563 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.288209 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.315988 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.527034 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.595663 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.659587 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.688503 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.690238 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.693215 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.705613 4831 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.706878 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f9fdf949f-bkdcc" podStartSLOduration=46.706852129 podStartE2EDuration="46.706852129s" podCreationTimestamp="2026-03-09 16:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:03:25.589279686 +0000 UTC m=+332.722962119" watchObservedRunningTime="2026-03-09 16:04:08.706852129 +0000 UTC m=+375.840534562"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.711564 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2nnxk","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.711638 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.711974 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.712002 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="241ae37c-298d-408e-85c3-b88a569b096c"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.715650 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.734722 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.734704751 podStartE2EDuration="21.734704751s" podCreationTimestamp="2026-03-09 16:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:04:08.732158409 +0000 UTC m=+375.865840842" watchObservedRunningTime="2026-03-09 16:04:08.734704751 +0000 UTC m=+375.868387174"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.748105 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.764039 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.920697 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.938538 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 16:04:08 crc kubenswrapper[4831]: I0309 16:04:08.983484 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.074266 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.088486 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.101632 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.268362 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.352971 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.357670 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.450797 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.559052 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.626815 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62d1a4be-a162-466f-b579-247a86379faa" path="/var/lib/kubelet/pods/62d1a4be-a162-466f-b579-247a86379faa/volumes"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.636253 4831 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.659770 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.778515 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.803759 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.811844 4831 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.812050 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://fc815dd1279d31cb6205f9013527dec6eef5af351fb9d33ab759b0591d0f0fb0" gracePeriod=5
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.817528 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.929485 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.932190 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.989864 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 09 16:04:09 crc kubenswrapper[4831]: I0309 16:04:09.990061 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.053531 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.123654 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.130582 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.359572 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.449256 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.467831 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.596847 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.770353 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.839698 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 09 16:04:10 crc kubenswrapper[4831]: I0309 16:04:10.936442 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.286866 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.326396 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.329006 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.458379 4831 reflector.go:368]
Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.606715 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.675156 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.781529 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 16:04:11 crc kubenswrapper[4831]: I0309 16:04:11.805390 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.119893 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.158894 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.169139 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.390978 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.564292 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.572834 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.798825 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 16:04:12 crc kubenswrapper[4831]: I0309 16:04:12.851282 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 16:04:13 crc kubenswrapper[4831]: I0309 16:04:13.047270 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 16:04:13 crc kubenswrapper[4831]: I0309 16:04:13.326784 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 16:04:13 crc kubenswrapper[4831]: I0309 16:04:13.361449 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 16:04:13 crc kubenswrapper[4831]: I0309 16:04:13.444356 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 16:04:13 crc kubenswrapper[4831]: I0309 16:04:13.785914 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 16:04:13 crc kubenswrapper[4831]: I0309 16:04:13.826031 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 16:04:14 crc kubenswrapper[4831]: I0309 16:04:14.022636 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 16:04:14 crc kubenswrapper[4831]: I0309 16:04:14.041008 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 16:04:14 crc 
kubenswrapper[4831]: I0309 16:04:14.883167 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 16:04:14 crc kubenswrapper[4831]: I0309 16:04:14.883219 4831 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="fc815dd1279d31cb6205f9013527dec6eef5af351fb9d33ab759b0591d0f0fb0" exitCode=137 Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.396756 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.396827 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.410103 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.518963 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519115 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519231 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519337 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519442 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519525 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519600 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519693 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.519771 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.520096 4831 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.520545 4831 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.520576 4831 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.520600 4831 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.529742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.621797 4831 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.625851 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.891131 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.891216 4831 scope.go:117] "RemoveContainer" containerID="fc815dd1279d31cb6205f9013527dec6eef5af351fb9d33ab759b0591d0f0fb0" Mar 09 16:04:15 crc kubenswrapper[4831]: I0309 16:04:15.891294 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.126056 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d49c9496c-8w4dv"] Mar 09 16:04:16 crc kubenswrapper[4831]: E0309 16:04:16.126363 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" containerName="installer" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.126379 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" containerName="installer" Mar 09 16:04:16 crc kubenswrapper[4831]: E0309 16:04:16.126392 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d1a4be-a162-466f-b579-247a86379faa" containerName="oauth-openshift" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.126456 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d1a4be-a162-466f-b579-247a86379faa" containerName="oauth-openshift" Mar 09 16:04:16 crc kubenswrapper[4831]: E0309 16:04:16.126472 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.126478 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.126604 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.126623 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb770b7c-9f0b-4e5e-a912-f34519e56e13" containerName="installer" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.126633 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="62d1a4be-a162-466f-b579-247a86379faa" containerName="oauth-openshift" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.127148 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.130922 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.130977 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.131087 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.131935 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.132967 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.133176 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.135555 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.135690 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.136774 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 16:04:16 
crc kubenswrapper[4831]: I0309 16:04:16.139323 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.143939 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d49c9496c-8w4dv"] Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.144342 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.144613 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.144917 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.150893 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.157210 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.228880 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.228923 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e614ee87-817d-4504-ad7a-ade28e31acd1-audit-dir\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.228967 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.228990 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzcgd\" (UniqueName: \"kubernetes.io/projected/e614ee87-817d-4504-ad7a-ade28e31acd1-kube-api-access-pzcgd\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229032 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " 
pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229071 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229091 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229122 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc 
kubenswrapper[4831]: I0309 16:04:16.229150 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-login\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229167 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-session\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-audit-policies\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.229214 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-error\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.330223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.330512 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-login\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.330612 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-session\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.330709 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-audit-policies\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.330794 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-error\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 
16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.330904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.331060 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e614ee87-817d-4504-ad7a-ade28e31acd1-audit-dir\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.331621 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.331748 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.331879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzcgd\" (UniqueName: \"kubernetes.io/projected/e614ee87-817d-4504-ad7a-ade28e31acd1-kube-api-access-pzcgd\") pod 
\"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.331978 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.332091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.332156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.331895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-audit-policies\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.332370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.331194 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e614ee87-817d-4504-ad7a-ade28e31acd1-audit-dir\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.332812 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.334321 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.334452 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: 
\"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.335087 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.335441 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-session\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.335511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.337775 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-error\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.338815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-login\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.338955 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.339831 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.350219 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzcgd\" (UniqueName: \"kubernetes.io/projected/e614ee87-817d-4504-ad7a-ade28e31acd1-kube-api-access-pzcgd\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") " pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.351657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e614ee87-817d-4504-ad7a-ade28e31acd1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d49c9496c-8w4dv\" (UID: \"e614ee87-817d-4504-ad7a-ade28e31acd1\") 
" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.447791 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.881869 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d49c9496c-8w4dv"] Mar 09 16:04:16 crc kubenswrapper[4831]: I0309 16:04:16.903982 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" event={"ID":"e614ee87-817d-4504-ad7a-ade28e31acd1","Type":"ContainerStarted","Data":"65fa934cff1afb0caceecf83bb97512afc1b950e330f92e7de3b0b4b9167b9d4"} Mar 09 16:04:17 crc kubenswrapper[4831]: I0309 16:04:17.912611 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" event={"ID":"e614ee87-817d-4504-ad7a-ade28e31acd1","Type":"ContainerStarted","Data":"d4b8fb461bb58b9be1711f9e95e83cbcc3c2725a55fab41dd869e417e9cbb74b"} Mar 09 16:04:17 crc kubenswrapper[4831]: I0309 16:04:17.913087 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:17 crc kubenswrapper[4831]: I0309 16:04:17.921374 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" Mar 09 16:04:17 crc kubenswrapper[4831]: I0309 16:04:17.938823 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d49c9496c-8w4dv" podStartSLOduration=61.938792849 podStartE2EDuration="1m1.938792849s" podCreationTimestamp="2026-03-09 16:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:04:17.935049314 
+0000 UTC m=+385.068731777" watchObservedRunningTime="2026-03-09 16:04:17.938792849 +0000 UTC m=+385.072475272" Mar 09 16:04:34 crc kubenswrapper[4831]: I0309 16:04:34.026639 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 16:04:34 crc kubenswrapper[4831]: I0309 16:04:34.030031 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 16:04:34 crc kubenswrapper[4831]: I0309 16:04:34.031880 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 16:04:34 crc kubenswrapper[4831]: I0309 16:04:34.031967 4831 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ccc71d149c402ba43d55d894f4d3f6dfd20fce7cb49269b0f41c6220fc6f5afb" exitCode=137 Mar 09 16:04:34 crc kubenswrapper[4831]: I0309 16:04:34.032037 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ccc71d149c402ba43d55d894f4d3f6dfd20fce7cb49269b0f41c6220fc6f5afb"} Mar 09 16:04:34 crc kubenswrapper[4831]: I0309 16:04:34.032101 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ea16681a1bf025592de86a200f42ff34e5847141bf4c32b9e5ac541bfc349f7"} Mar 09 16:04:34 crc kubenswrapper[4831]: I0309 16:04:34.032129 4831 scope.go:117] "RemoveContainer" containerID="8c89a337e586b29e5c640f1fc4688a4aae1a3b814a80507ffd80a38e9419652f" Mar 09 16:04:35 crc 
kubenswrapper[4831]: I0309 16:04:35.043322 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 16:04:35 crc kubenswrapper[4831]: I0309 16:04:35.045490 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 16:04:36 crc kubenswrapper[4831]: I0309 16:04:36.057795 4831 generic.go:334] "Generic (PLEG): container finished" podID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerID="d2a9f36ba413b9f191923c5026ac9a9c7cecfadf5312f209da3d867a4930a14e" exitCode=0 Mar 09 16:04:36 crc kubenswrapper[4831]: I0309 16:04:36.057858 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" event={"ID":"f742f3f0-b95d-47f7-8554-5c22aa9c60f7","Type":"ContainerDied","Data":"d2a9f36ba413b9f191923c5026ac9a9c7cecfadf5312f209da3d867a4930a14e"} Mar 09 16:04:36 crc kubenswrapper[4831]: I0309 16:04:36.058464 4831 scope.go:117] "RemoveContainer" containerID="d2a9f36ba413b9f191923c5026ac9a9c7cecfadf5312f209da3d867a4930a14e" Mar 09 16:04:37 crc kubenswrapper[4831]: I0309 16:04:37.075353 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" event={"ID":"f742f3f0-b95d-47f7-8554-5c22aa9c60f7","Type":"ContainerStarted","Data":"c324c41ddbd06e9e96a9a00fbe01e9116c8b9fef95624d9b94e8ed72cbe03911"} Mar 09 16:04:37 crc kubenswrapper[4831]: I0309 16:04:37.076471 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:04:37 crc kubenswrapper[4831]: I0309 16:04:37.082182 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 
16:04:37 crc kubenswrapper[4831]: I0309 16:04:37.718961 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 16:04:43 crc kubenswrapper[4831]: I0309 16:04:43.245654 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 16:04:43 crc kubenswrapper[4831]: I0309 16:04:43.250111 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 16:04:44 crc kubenswrapper[4831]: I0309 16:04:44.120918 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.324242 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551204-bqcbg"] Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.325675 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551204-bqcbg" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.328946 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.329130 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.329361 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.338796 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551204-bqcbg"] Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.374283 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kq8\" (UniqueName: \"kubernetes.io/projected/f12621d7-3190-4c92-b62e-9f0e684ce767-kube-api-access-r9kq8\") pod \"auto-csr-approver-29551204-bqcbg\" (UID: \"f12621d7-3190-4c92-b62e-9f0e684ce767\") " pod="openshift-infra/auto-csr-approver-29551204-bqcbg" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.475505 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kq8\" (UniqueName: \"kubernetes.io/projected/f12621d7-3190-4c92-b62e-9f0e684ce767-kube-api-access-r9kq8\") pod \"auto-csr-approver-29551204-bqcbg\" (UID: \"f12621d7-3190-4c92-b62e-9f0e684ce767\") " pod="openshift-infra/auto-csr-approver-29551204-bqcbg" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.507636 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kq8\" (UniqueName: \"kubernetes.io/projected/f12621d7-3190-4c92-b62e-9f0e684ce767-kube-api-access-r9kq8\") pod \"auto-csr-approver-29551204-bqcbg\" (UID: \"f12621d7-3190-4c92-b62e-9f0e684ce767\") " 
pod="openshift-infra/auto-csr-approver-29551204-bqcbg" Mar 09 16:04:54 crc kubenswrapper[4831]: I0309 16:04:54.641506 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551204-bqcbg" Mar 09 16:04:55 crc kubenswrapper[4831]: I0309 16:04:55.128299 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551204-bqcbg"] Mar 09 16:04:55 crc kubenswrapper[4831]: W0309 16:04:55.135747 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf12621d7_3190_4c92_b62e_9f0e684ce767.slice/crio-d18296b838bf22de092aae808d426b128d8bfda37a1a1729ac994bd460d13fee WatchSource:0}: Error finding container d18296b838bf22de092aae808d426b128d8bfda37a1a1729ac994bd460d13fee: Status 404 returned error can't find the container with id d18296b838bf22de092aae808d426b128d8bfda37a1a1729ac994bd460d13fee Mar 09 16:04:55 crc kubenswrapper[4831]: I0309 16:04:55.214171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551204-bqcbg" event={"ID":"f12621d7-3190-4c92-b62e-9f0e684ce767","Type":"ContainerStarted","Data":"d18296b838bf22de092aae808d426b128d8bfda37a1a1729ac994bd460d13fee"} Mar 09 16:04:57 crc kubenswrapper[4831]: I0309 16:04:57.227688 4831 generic.go:334] "Generic (PLEG): container finished" podID="f12621d7-3190-4c92-b62e-9f0e684ce767" containerID="83c2d419e353fdf32f96a626038be28ded202fbe7c1caa9d4de0aca90855caf8" exitCode=0 Mar 09 16:04:57 crc kubenswrapper[4831]: I0309 16:04:57.227755 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551204-bqcbg" event={"ID":"f12621d7-3190-4c92-b62e-9f0e684ce767","Type":"ContainerDied","Data":"83c2d419e353fdf32f96a626038be28ded202fbe7c1caa9d4de0aca90855caf8"} Mar 09 16:04:58 crc kubenswrapper[4831]: I0309 16:04:58.523939 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551204-bqcbg" Mar 09 16:04:58 crc kubenswrapper[4831]: I0309 16:04:58.634725 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9kq8\" (UniqueName: \"kubernetes.io/projected/f12621d7-3190-4c92-b62e-9f0e684ce767-kube-api-access-r9kq8\") pod \"f12621d7-3190-4c92-b62e-9f0e684ce767\" (UID: \"f12621d7-3190-4c92-b62e-9f0e684ce767\") " Mar 09 16:04:58 crc kubenswrapper[4831]: I0309 16:04:58.641014 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12621d7-3190-4c92-b62e-9f0e684ce767-kube-api-access-r9kq8" (OuterVolumeSpecName: "kube-api-access-r9kq8") pod "f12621d7-3190-4c92-b62e-9f0e684ce767" (UID: "f12621d7-3190-4c92-b62e-9f0e684ce767"). InnerVolumeSpecName "kube-api-access-r9kq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:04:58 crc kubenswrapper[4831]: I0309 16:04:58.735811 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9kq8\" (UniqueName: \"kubernetes.io/projected/f12621d7-3190-4c92-b62e-9f0e684ce767-kube-api-access-r9kq8\") on node \"crc\" DevicePath \"\"" Mar 09 16:04:59 crc kubenswrapper[4831]: I0309 16:04:59.244840 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551204-bqcbg" event={"ID":"f12621d7-3190-4c92-b62e-9f0e684ce767","Type":"ContainerDied","Data":"d18296b838bf22de092aae808d426b128d8bfda37a1a1729ac994bd460d13fee"} Mar 09 16:04:59 crc kubenswrapper[4831]: I0309 16:04:59.244897 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d18296b838bf22de092aae808d426b128d8bfda37a1a1729ac994bd460d13fee" Mar 09 16:04:59 crc kubenswrapper[4831]: I0309 16:04:59.244947 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551204-bqcbg" Mar 09 16:05:03 crc kubenswrapper[4831]: I0309 16:05:03.018711 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:05:03 crc kubenswrapper[4831]: I0309 16:05:03.019175 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:05:33 crc kubenswrapper[4831]: I0309 16:05:33.018647 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:05:33 crc kubenswrapper[4831]: I0309 16:05:33.019247 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.636906 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9z6ql"] Mar 09 16:05:36 crc kubenswrapper[4831]: E0309 16:05:36.638549 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12621d7-3190-4c92-b62e-9f0e684ce767" containerName="oc" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 
16:05:36.638676 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12621d7-3190-4c92-b62e-9f0e684ce767" containerName="oc" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.638888 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12621d7-3190-4c92-b62e-9f0e684ce767" containerName="oc" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.639451 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.649642 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9z6ql"] Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729364 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729432 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3fb7b3d2-2dae-45f4-b055-463f169c0b03-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729452 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-registry-tls\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729485 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fb7b3d2-2dae-45f4-b055-463f169c0b03-trusted-ca\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729544 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3fb7b3d2-2dae-45f4-b055-463f169c0b03-registry-certificates\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8qv\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-kube-api-access-4h8qv\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729626 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3fb7b3d2-2dae-45f4-b055-463f169c0b03-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.729669 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-bound-sa-token\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.754263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.830561 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3fb7b3d2-2dae-45f4-b055-463f169c0b03-registry-certificates\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.830601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8qv\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-kube-api-access-4h8qv\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.830627 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3fb7b3d2-2dae-45f4-b055-463f169c0b03-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: 
I0309 16:05:36.830659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-bound-sa-token\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.830707 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3fb7b3d2-2dae-45f4-b055-463f169c0b03-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.830726 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-registry-tls\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.830745 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fb7b3d2-2dae-45f4-b055-463f169c0b03-trusted-ca\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.831252 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3fb7b3d2-2dae-45f4-b055-463f169c0b03-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.831828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3fb7b3d2-2dae-45f4-b055-463f169c0b03-registry-certificates\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.832748 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fb7b3d2-2dae-45f4-b055-463f169c0b03-trusted-ca\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.837171 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3fb7b3d2-2dae-45f4-b055-463f169c0b03-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.837227 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-registry-tls\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.848991 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-bound-sa-token\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: 
\"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.849286 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8qv\" (UniqueName: \"kubernetes.io/projected/3fb7b3d2-2dae-45f4-b055-463f169c0b03-kube-api-access-4h8qv\") pod \"image-registry-66df7c8f76-9z6ql\" (UID: \"3fb7b3d2-2dae-45f4-b055-463f169c0b03\") " pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:36 crc kubenswrapper[4831]: I0309 16:05:36.956785 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:37 crc kubenswrapper[4831]: I0309 16:05:37.363928 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9z6ql"] Mar 09 16:05:37 crc kubenswrapper[4831]: I0309 16:05:37.462071 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" event={"ID":"3fb7b3d2-2dae-45f4-b055-463f169c0b03","Type":"ContainerStarted","Data":"cf9e718a4d7d7d410ac517d2cbcb27dee102e56ed230ee1e7d5d7f5ee1855e31"} Mar 09 16:05:38 crc kubenswrapper[4831]: I0309 16:05:38.471329 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" event={"ID":"3fb7b3d2-2dae-45f4-b055-463f169c0b03","Type":"ContainerStarted","Data":"c9c1f590033f83be21241bc5a47b7b19c49ea2b54e63a0b4b7e67b71d3f78b8d"} Mar 09 16:05:38 crc kubenswrapper[4831]: I0309 16:05:38.473391 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:38 crc kubenswrapper[4831]: I0309 16:05:38.496993 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" podStartSLOduration=2.496979379 
podStartE2EDuration="2.496979379s" podCreationTimestamp="2026-03-09 16:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:05:38.491386527 +0000 UTC m=+465.625068950" watchObservedRunningTime="2026-03-09 16:05:38.496979379 +0000 UTC m=+465.630661802" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.144718 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gbpnl"] Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.146974 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gbpnl" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="registry-server" containerID="cri-o://3178ecacf213c3dee197d8de880c32dd60b61e115c399d5ad5e82b068e2abb77" gracePeriod=30 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.155919 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5rzt7"] Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.156297 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5rzt7" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="registry-server" containerID="cri-o://7039dd4fab36a97b97ac50d9e38551bc0c7b458e5a6489bdcb23d1dd52b541e7" gracePeriod=30 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.164023 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lv25s"] Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.164274 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" 
containerID="cri-o://c324c41ddbd06e9e96a9a00fbe01e9116c8b9fef95624d9b94e8ed72cbe03911" gracePeriod=30 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.177175 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rfth"] Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.177895 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9rfth" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="registry-server" containerID="cri-o://5a08315e716b902ce353dbbe3ddb0abe6714d139864b939d28443f59e4cc904f" gracePeriod=30 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.188456 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4629"] Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.188706 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d4629" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="registry-server" containerID="cri-o://7bbf68197dbce2e98c0fcc985a29ab3dde6c5630c2745a5ec6b958a3dccf6a93" gracePeriod=30 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.195353 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-br9lf"] Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.196092 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.212751 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-br9lf"] Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.312888 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/130188b4-2b02-462a-9b83-cf6930ed2ea0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.313483 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbg5t\" (UniqueName: \"kubernetes.io/projected/130188b4-2b02-462a-9b83-cf6930ed2ea0-kube-api-access-sbg5t\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.313547 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/130188b4-2b02-462a-9b83-cf6930ed2ea0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.415291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/130188b4-2b02-462a-9b83-cf6930ed2ea0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-br9lf\" (UID: 
\"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.415340 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbg5t\" (UniqueName: \"kubernetes.io/projected/130188b4-2b02-462a-9b83-cf6930ed2ea0-kube-api-access-sbg5t\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.415381 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/130188b4-2b02-462a-9b83-cf6930ed2ea0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.418143 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/130188b4-2b02-462a-9b83-cf6930ed2ea0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.424437 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/130188b4-2b02-462a-9b83-cf6930ed2ea0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.440271 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sbg5t\" (UniqueName: \"kubernetes.io/projected/130188b4-2b02-462a-9b83-cf6930ed2ea0-kube-api-access-sbg5t\") pod \"marketplace-operator-79b997595-br9lf\" (UID: \"130188b4-2b02-462a-9b83-cf6930ed2ea0\") " pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.522148 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.541954 4831 generic.go:334] "Generic (PLEG): container finished" podID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerID="7039dd4fab36a97b97ac50d9e38551bc0c7b458e5a6489bdcb23d1dd52b541e7" exitCode=0 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.542074 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rzt7" event={"ID":"93ec285c-3738-4b32-b6fc-abdf28c52c55","Type":"ContainerDied","Data":"7039dd4fab36a97b97ac50d9e38551bc0c7b458e5a6489bdcb23d1dd52b541e7"} Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.542116 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rzt7" event={"ID":"93ec285c-3738-4b32-b6fc-abdf28c52c55","Type":"ContainerDied","Data":"365095f40576a62895edc0db0b0e5470c6d4d2c274954b102d144fce369d0c75"} Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.542170 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="365095f40576a62895edc0db0b0e5470c6d4d2c274954b102d144fce369d0c75" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.557750 4831 generic.go:334] "Generic (PLEG): container finished" podID="44dba940-7ade-48aa-91e5-a358ac696126" containerID="5a08315e716b902ce353dbbe3ddb0abe6714d139864b939d28443f59e4cc904f" exitCode=0 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.557831 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9rfth" event={"ID":"44dba940-7ade-48aa-91e5-a358ac696126","Type":"ContainerDied","Data":"5a08315e716b902ce353dbbe3ddb0abe6714d139864b939d28443f59e4cc904f"} Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.607666 4831 generic.go:334] "Generic (PLEG): container finished" podID="1cdec29c-2ba7-47b6-9446-95791b883267" containerID="7bbf68197dbce2e98c0fcc985a29ab3dde6c5630c2745a5ec6b958a3dccf6a93" exitCode=0 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.607740 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4629" event={"ID":"1cdec29c-2ba7-47b6-9446-95791b883267","Type":"ContainerDied","Data":"7bbf68197dbce2e98c0fcc985a29ab3dde6c5630c2745a5ec6b958a3dccf6a93"} Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.610237 4831 generic.go:334] "Generic (PLEG): container finished" podID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerID="3178ecacf213c3dee197d8de880c32dd60b61e115c399d5ad5e82b068e2abb77" exitCode=0 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.610296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbpnl" event={"ID":"1ced3eb3-5570-485a-9828-4c509ecd19f2","Type":"ContainerDied","Data":"3178ecacf213c3dee197d8de880c32dd60b61e115c399d5ad5e82b068e2abb77"} Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.611820 4831 generic.go:334] "Generic (PLEG): container finished" podID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerID="c324c41ddbd06e9e96a9a00fbe01e9116c8b9fef95624d9b94e8ed72cbe03911" exitCode=0 Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.611846 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" event={"ID":"f742f3f0-b95d-47f7-8554-5c22aa9c60f7","Type":"ContainerDied","Data":"c324c41ddbd06e9e96a9a00fbe01e9116c8b9fef95624d9b94e8ed72cbe03911"} Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 
16:05:48.611883 4831 scope.go:117] "RemoveContainer" containerID="d2a9f36ba413b9f191923c5026ac9a9c7cecfadf5312f209da3d867a4930a14e" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.651841 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.676735 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.679259 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.703878 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.759869 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.822857 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-utilities\") pod \"93ec285c-3738-4b32-b6fc-abdf28c52c55\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.822917 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-catalog-content\") pod \"44dba940-7ade-48aa-91e5-a358ac696126\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.822945 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7zwp\" (UniqueName: \"kubernetes.io/projected/44dba940-7ade-48aa-91e5-a358ac696126-kube-api-access-t7zwp\") pod \"44dba940-7ade-48aa-91e5-a358ac696126\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.822990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-utilities\") pod \"1ced3eb3-5570-485a-9828-4c509ecd19f2\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823017 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-utilities\") pod \"44dba940-7ade-48aa-91e5-a358ac696126\" (UID: \"44dba940-7ade-48aa-91e5-a358ac696126\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-catalog-content\") pod \"93ec285c-3738-4b32-b6fc-abdf28c52c55\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823085 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-catalog-content\") pod \"1ced3eb3-5570-485a-9828-4c509ecd19f2\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823121 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf6lf\" (UniqueName: \"kubernetes.io/projected/93ec285c-3738-4b32-b6fc-abdf28c52c55-kube-api-access-wf6lf\") pod \"93ec285c-3738-4b32-b6fc-abdf28c52c55\" (UID: \"93ec285c-3738-4b32-b6fc-abdf28c52c55\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823157 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-trusted-ca\") pod \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823179 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-operator-metrics\") pod \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\" (UID: \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823210 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqfjh\" (UniqueName: \"kubernetes.io/projected/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-kube-api-access-wqfjh\") pod \"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\" (UID: 
\"f742f3f0-b95d-47f7-8554-5c22aa9c60f7\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.823244 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57kkr\" (UniqueName: \"kubernetes.io/projected/1ced3eb3-5570-485a-9828-4c509ecd19f2-kube-api-access-57kkr\") pod \"1ced3eb3-5570-485a-9828-4c509ecd19f2\" (UID: \"1ced3eb3-5570-485a-9828-4c509ecd19f2\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.824046 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-utilities" (OuterVolumeSpecName: "utilities") pod "93ec285c-3738-4b32-b6fc-abdf28c52c55" (UID: "93ec285c-3738-4b32-b6fc-abdf28c52c55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.829370 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-utilities" (OuterVolumeSpecName: "utilities") pod "1ced3eb3-5570-485a-9828-4c509ecd19f2" (UID: "1ced3eb3-5570-485a-9828-4c509ecd19f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.834652 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f742f3f0-b95d-47f7-8554-5c22aa9c60f7" (UID: "f742f3f0-b95d-47f7-8554-5c22aa9c60f7"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.843534 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-utilities" (OuterVolumeSpecName: "utilities") pod "44dba940-7ade-48aa-91e5-a358ac696126" (UID: "44dba940-7ade-48aa-91e5-a358ac696126"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.846703 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44dba940-7ade-48aa-91e5-a358ac696126-kube-api-access-t7zwp" (OuterVolumeSpecName: "kube-api-access-t7zwp") pod "44dba940-7ade-48aa-91e5-a358ac696126" (UID: "44dba940-7ade-48aa-91e5-a358ac696126"). InnerVolumeSpecName "kube-api-access-t7zwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.863958 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f742f3f0-b95d-47f7-8554-5c22aa9c60f7" (UID: "f742f3f0-b95d-47f7-8554-5c22aa9c60f7"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.864257 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ec285c-3738-4b32-b6fc-abdf28c52c55-kube-api-access-wf6lf" (OuterVolumeSpecName: "kube-api-access-wf6lf") pod "93ec285c-3738-4b32-b6fc-abdf28c52c55" (UID: "93ec285c-3738-4b32-b6fc-abdf28c52c55"). InnerVolumeSpecName "kube-api-access-wf6lf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.867370 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ced3eb3-5570-485a-9828-4c509ecd19f2-kube-api-access-57kkr" (OuterVolumeSpecName: "kube-api-access-57kkr") pod "1ced3eb3-5570-485a-9828-4c509ecd19f2" (UID: "1ced3eb3-5570-485a-9828-4c509ecd19f2"). InnerVolumeSpecName "kube-api-access-57kkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.885681 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-kube-api-access-wqfjh" (OuterVolumeSpecName: "kube-api-access-wqfjh") pod "f742f3f0-b95d-47f7-8554-5c22aa9c60f7" (UID: "f742f3f0-b95d-47f7-8554-5c22aa9c60f7"). InnerVolumeSpecName "kube-api-access-wqfjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.889546 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44dba940-7ade-48aa-91e5-a358ac696126" (UID: "44dba940-7ade-48aa-91e5-a358ac696126"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.922516 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ced3eb3-5570-485a-9828-4c509ecd19f2" (UID: "1ced3eb3-5570-485a-9828-4c509ecd19f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924339 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4bw\" (UniqueName: \"kubernetes.io/projected/1cdec29c-2ba7-47b6-9446-95791b883267-kube-api-access-xs4bw\") pod \"1cdec29c-2ba7-47b6-9446-95791b883267\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924439 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-utilities\") pod \"1cdec29c-2ba7-47b6-9446-95791b883267\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924527 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-catalog-content\") pod \"1cdec29c-2ba7-47b6-9446-95791b883267\" (UID: \"1cdec29c-2ba7-47b6-9446-95791b883267\") " Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924761 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924773 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924782 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ced3eb3-5570-485a-9828-4c509ecd19f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924794 4831 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf6lf\" (UniqueName: \"kubernetes.io/projected/93ec285c-3738-4b32-b6fc-abdf28c52c55-kube-api-access-wf6lf\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924803 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924812 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924821 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqfjh\" (UniqueName: \"kubernetes.io/projected/f742f3f0-b95d-47f7-8554-5c22aa9c60f7-kube-api-access-wqfjh\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924832 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57kkr\" (UniqueName: \"kubernetes.io/projected/1ced3eb3-5570-485a-9828-4c509ecd19f2-kube-api-access-57kkr\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924841 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924850 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dba940-7ade-48aa-91e5-a358ac696126-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.924858 4831 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-t7zwp\" (UniqueName: \"kubernetes.io/projected/44dba940-7ade-48aa-91e5-a358ac696126-kube-api-access-t7zwp\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.925445 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-utilities" (OuterVolumeSpecName: "utilities") pod "1cdec29c-2ba7-47b6-9446-95791b883267" (UID: "1cdec29c-2ba7-47b6-9446-95791b883267"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.927547 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cdec29c-2ba7-47b6-9446-95791b883267-kube-api-access-xs4bw" (OuterVolumeSpecName: "kube-api-access-xs4bw") pod "1cdec29c-2ba7-47b6-9446-95791b883267" (UID: "1cdec29c-2ba7-47b6-9446-95791b883267"). InnerVolumeSpecName "kube-api-access-xs4bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:05:48 crc kubenswrapper[4831]: I0309 16:05:48.944044 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93ec285c-3738-4b32-b6fc-abdf28c52c55" (UID: "93ec285c-3738-4b32-b6fc-abdf28c52c55"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.026157 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ec285c-3738-4b32-b6fc-abdf28c52c55-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.026190 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4bw\" (UniqueName: \"kubernetes.io/projected/1cdec29c-2ba7-47b6-9446-95791b883267-kube-api-access-xs4bw\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.026202 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.032621 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-br9lf"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.068004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cdec29c-2ba7-47b6-9446-95791b883267" (UID: "1cdec29c-2ba7-47b6-9446-95791b883267"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.128121 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cdec29c-2ba7-47b6-9446-95791b883267-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.619310 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4629" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.624673 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gbpnl" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.628648 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4629" event={"ID":"1cdec29c-2ba7-47b6-9446-95791b883267","Type":"ContainerDied","Data":"97a7a6f68411edc9438b6148fa9193ecdcd828663ee63666ffec01a9d7cdac1b"} Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.628726 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbpnl" event={"ID":"1ced3eb3-5570-485a-9828-4c509ecd19f2","Type":"ContainerDied","Data":"31f8144c209e24c28a3821945d00e07ca3426b684827fe6144ff194d4d3557aa"} Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.628761 4831 scope.go:117] "RemoveContainer" containerID="7bbf68197dbce2e98c0fcc985a29ab3dde6c5630c2745a5ec6b958a3dccf6a93" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.630882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" event={"ID":"f742f3f0-b95d-47f7-8554-5c22aa9c60f7","Type":"ContainerDied","Data":"e9fd6d608eaa51f1a71bebccf735e56456be77dadc489720c1d15d8a0e5f1c31"} Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.630954 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lv25s" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.635556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" event={"ID":"130188b4-2b02-462a-9b83-cf6930ed2ea0","Type":"ContainerStarted","Data":"dea76ae35ccb72f5d50d96c128fd2a3e7e747110cd0b39f5e00832335db8b5da"} Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.635657 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" event={"ID":"130188b4-2b02-462a-9b83-cf6930ed2ea0","Type":"ContainerStarted","Data":"35b3c16c30f24e5b697ef67607cdfb87379cfadbfdbdac9a68e00ec6263e0b6a"} Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.656858 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.657692 4831 scope.go:117] "RemoveContainer" containerID="981e9d8def2c2bff2d2987a1b189a7f7778b17162e0e031cbeda4e2670c0f653" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.661011 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rzt7" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.661914 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rfth" event={"ID":"44dba940-7ade-48aa-91e5-a358ac696126","Type":"ContainerDied","Data":"26a5678493219447a690a7cd6c752ef4846596e60d89ca67093b7e2c2731a53d"} Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.663103 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rfth" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.689703 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.700020 4831 scope.go:117] "RemoveContainer" containerID="a529540ba0731cc548c65756a613acd3f817efafd943b361910810048b80c0d3" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.711596 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-br9lf" podStartSLOduration=1.711560368 podStartE2EDuration="1.711560368s" podCreationTimestamp="2026-03-09 16:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:05:49.705650966 +0000 UTC m=+476.839333389" watchObservedRunningTime="2026-03-09 16:05:49.711560368 +0000 UTC m=+476.845242791" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.724545 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4629"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.729693 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d4629"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.738820 4831 scope.go:117] "RemoveContainer" containerID="3178ecacf213c3dee197d8de880c32dd60b61e115c399d5ad5e82b068e2abb77" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.742119 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gbpnl"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.749796 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gbpnl"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.756668 4831 scope.go:117] 
"RemoveContainer" containerID="8218ee1072dd7a15964d3371aafb4c61a4a06682deef4d8139e7aff6eed285a4" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.779566 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5rzt7"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.783708 4831 scope.go:117] "RemoveContainer" containerID="7f1045c3112b483866d8d4d2a6f8915b54dbfd8d84f6d698612b592f4d50b55f" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.787883 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5rzt7"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.794519 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lv25s"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.804976 4831 scope.go:117] "RemoveContainer" containerID="c324c41ddbd06e9e96a9a00fbe01e9116c8b9fef95624d9b94e8ed72cbe03911" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.805171 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lv25s"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.810285 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rfth"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.813049 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rfth"] Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.834642 4831 scope.go:117] "RemoveContainer" containerID="5a08315e716b902ce353dbbe3ddb0abe6714d139864b939d28443f59e4cc904f" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.857286 4831 scope.go:117] "RemoveContainer" containerID="756416447f5f3c2658dff1037ce89876436dadc185adee582083b1bf1d00d1e7" Mar 09 16:05:49 crc kubenswrapper[4831]: I0309 16:05:49.871803 4831 scope.go:117] "RemoveContainer" 
containerID="d4c9820ce84ac2cc05f61e1baca7589fa5f3bac0491f62511ef3266436faef91" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361256 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k56j6"] Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361576 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361599 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361615 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361625 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361641 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361650 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361667 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361676 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361690 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361699 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361713 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361724 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361738 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361748 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="extract-utilities" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361759 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361767 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361777 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361785 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361800 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361809 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361821 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361829 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361840 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361848 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.361866 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.361874 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="extract-content" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.362020 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.362036 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.362049 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.362062 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="44dba940-7ade-48aa-91e5-a358ac696126" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.362072 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" containerName="registry-server" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.362085 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" Mar 09 16:05:50 crc kubenswrapper[4831]: E0309 16:05:50.362207 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.362219 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" containerName="marketplace-operator" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.363134 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.365120 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.379062 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k56j6"] Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.484278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9lv\" (UniqueName: \"kubernetes.io/projected/19e8f5e0-340b-4e8d-8af0-ec018733fe09-kube-api-access-4b9lv\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.484350 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e8f5e0-340b-4e8d-8af0-ec018733fe09-catalog-content\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.484391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e8f5e0-340b-4e8d-8af0-ec018733fe09-utilities\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.560467 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2sr4l"] Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.561683 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.566289 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.573340 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sr4l"] Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.586911 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e8f5e0-340b-4e8d-8af0-ec018733fe09-utilities\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.587524 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9lv\" (UniqueName: \"kubernetes.io/projected/19e8f5e0-340b-4e8d-8af0-ec018733fe09-kube-api-access-4b9lv\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.587573 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e8f5e0-340b-4e8d-8af0-ec018733fe09-catalog-content\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.587836 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e8f5e0-340b-4e8d-8af0-ec018733fe09-utilities\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 
09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.588121 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e8f5e0-340b-4e8d-8af0-ec018733fe09-catalog-content\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.612657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9lv\" (UniqueName: \"kubernetes.io/projected/19e8f5e0-340b-4e8d-8af0-ec018733fe09-kube-api-access-4b9lv\") pod \"redhat-marketplace-k56j6\" (UID: \"19e8f5e0-340b-4e8d-8af0-ec018733fe09\") " pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.679625 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.689173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0c6900-0238-4664-a91c-057629716456-utilities\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.689225 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqql4\" (UniqueName: \"kubernetes.io/projected/bc0c6900-0238-4664-a91c-057629716456-kube-api-access-rqql4\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.689277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bc0c6900-0238-4664-a91c-057629716456-catalog-content\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.790965 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0c6900-0238-4664-a91c-057629716456-catalog-content\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.791107 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0c6900-0238-4664-a91c-057629716456-utilities\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.791136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqql4\" (UniqueName: \"kubernetes.io/projected/bc0c6900-0238-4664-a91c-057629716456-kube-api-access-rqql4\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.791632 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0c6900-0238-4664-a91c-057629716456-catalog-content\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.793056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bc0c6900-0238-4664-a91c-057629716456-utilities\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.808490 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqql4\" (UniqueName: \"kubernetes.io/projected/bc0c6900-0238-4664-a91c-057629716456-kube-api-access-rqql4\") pod \"redhat-operators-2sr4l\" (UID: \"bc0c6900-0238-4664-a91c-057629716456\") " pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.879896 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:05:50 crc kubenswrapper[4831]: I0309 16:05:50.897383 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k56j6"] Mar 09 16:05:50 crc kubenswrapper[4831]: W0309 16:05:50.909722 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e8f5e0_340b_4e8d_8af0_ec018733fe09.slice/crio-c6107d340327dcea05f946715502d15441904bca65c9d968a934f82dc66f14e6 WatchSource:0}: Error finding container c6107d340327dcea05f946715502d15441904bca65c9d968a934f82dc66f14e6: Status 404 returned error can't find the container with id c6107d340327dcea05f946715502d15441904bca65c9d968a934f82dc66f14e6 Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.142576 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sr4l"] Mar 09 16:05:51 crc kubenswrapper[4831]: W0309 16:05:51.176961 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0c6900_0238_4664_a91c_057629716456.slice/crio-b867336f1aa7464c21db760667be8afcbf45e793a3789dba11d792a89c039783 WatchSource:0}: 
Error finding container b867336f1aa7464c21db760667be8afcbf45e793a3789dba11d792a89c039783: Status 404 returned error can't find the container with id b867336f1aa7464c21db760667be8afcbf45e793a3789dba11d792a89c039783 Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.637290 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cdec29c-2ba7-47b6-9446-95791b883267" path="/var/lib/kubelet/pods/1cdec29c-2ba7-47b6-9446-95791b883267/volumes" Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.637959 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ced3eb3-5570-485a-9828-4c509ecd19f2" path="/var/lib/kubelet/pods/1ced3eb3-5570-485a-9828-4c509ecd19f2/volumes" Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.638680 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44dba940-7ade-48aa-91e5-a358ac696126" path="/var/lib/kubelet/pods/44dba940-7ade-48aa-91e5-a358ac696126/volumes" Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.639771 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ec285c-3738-4b32-b6fc-abdf28c52c55" path="/var/lib/kubelet/pods/93ec285c-3738-4b32-b6fc-abdf28c52c55/volumes" Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.640452 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f742f3f0-b95d-47f7-8554-5c22aa9c60f7" path="/var/lib/kubelet/pods/f742f3f0-b95d-47f7-8554-5c22aa9c60f7/volumes" Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.681869 4831 generic.go:334] "Generic (PLEG): container finished" podID="bc0c6900-0238-4664-a91c-057629716456" containerID="b7741fa42775553446436a1f72a9742f3d4d77607107a383a62445d31a6a933c" exitCode=0 Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.681920 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sr4l" 
event={"ID":"bc0c6900-0238-4664-a91c-057629716456","Type":"ContainerDied","Data":"b7741fa42775553446436a1f72a9742f3d4d77607107a383a62445d31a6a933c"} Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.681958 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sr4l" event={"ID":"bc0c6900-0238-4664-a91c-057629716456","Type":"ContainerStarted","Data":"b867336f1aa7464c21db760667be8afcbf45e793a3789dba11d792a89c039783"} Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.687335 4831 generic.go:334] "Generic (PLEG): container finished" podID="19e8f5e0-340b-4e8d-8af0-ec018733fe09" containerID="61dee1c409089c97d14b5c92b82a1cd6b5e46086cad5605e49e79b799b95eeab" exitCode=0 Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.687460 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k56j6" event={"ID":"19e8f5e0-340b-4e8d-8af0-ec018733fe09","Type":"ContainerDied","Data":"61dee1c409089c97d14b5c92b82a1cd6b5e46086cad5605e49e79b799b95eeab"} Mar 09 16:05:51 crc kubenswrapper[4831]: I0309 16:05:51.687542 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k56j6" event={"ID":"19e8f5e0-340b-4e8d-8af0-ec018733fe09","Type":"ContainerStarted","Data":"c6107d340327dcea05f946715502d15441904bca65c9d968a934f82dc66f14e6"} Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.694799 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k56j6" event={"ID":"19e8f5e0-340b-4e8d-8af0-ec018733fe09","Type":"ContainerStarted","Data":"532687cfc8efdee10eddb79f4665f1776f1ba83055e443b4fb686f3c21cbcfc6"} Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.755981 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qntgd"] Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.757269 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.759254 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.784057 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qntgd"] Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.852056 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62255\" (UniqueName: \"kubernetes.io/projected/806fda88-2f52-4fae-a835-3ffd3fd0e55e-kube-api-access-62255\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.852145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806fda88-2f52-4fae-a835-3ffd3fd0e55e-utilities\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.852209 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806fda88-2f52-4fae-a835-3ffd3fd0e55e-catalog-content\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.954750 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62255\" (UniqueName: \"kubernetes.io/projected/806fda88-2f52-4fae-a835-3ffd3fd0e55e-kube-api-access-62255\") pod \"certified-operators-qntgd\" 
(UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.955183 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806fda88-2f52-4fae-a835-3ffd3fd0e55e-utilities\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.955378 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806fda88-2f52-4fae-a835-3ffd3fd0e55e-catalog-content\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.955692 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806fda88-2f52-4fae-a835-3ffd3fd0e55e-utilities\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.955788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806fda88-2f52-4fae-a835-3ffd3fd0e55e-catalog-content\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.960224 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mx6mz"] Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.961598 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.966532 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.976170 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx6mz"] Mar 09 16:05:52 crc kubenswrapper[4831]: I0309 16:05:52.980022 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62255\" (UniqueName: \"kubernetes.io/projected/806fda88-2f52-4fae-a835-3ffd3fd0e55e-kube-api-access-62255\") pod \"certified-operators-qntgd\" (UID: \"806fda88-2f52-4fae-a835-3ffd3fd0e55e\") " pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.072142 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.158844 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52031b53-29e3-4087-ac0a-db35877849bc-catalog-content\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.159443 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52031b53-29e3-4087-ac0a-db35877849bc-utilities\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.159492 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4xt\" (UniqueName: \"kubernetes.io/projected/52031b53-29e3-4087-ac0a-db35877849bc-kube-api-access-zs4xt\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.261259 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52031b53-29e3-4087-ac0a-db35877849bc-catalog-content\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.261333 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52031b53-29e3-4087-ac0a-db35877849bc-utilities\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.261373 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4xt\" (UniqueName: \"kubernetes.io/projected/52031b53-29e3-4087-ac0a-db35877849bc-kube-api-access-zs4xt\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.262073 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52031b53-29e3-4087-ac0a-db35877849bc-catalog-content\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.262090 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52031b53-29e3-4087-ac0a-db35877849bc-utilities\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.280040 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4xt\" (UniqueName: \"kubernetes.io/projected/52031b53-29e3-4087-ac0a-db35877849bc-kube-api-access-zs4xt\") pod \"community-operators-mx6mz\" (UID: \"52031b53-29e3-4087-ac0a-db35877849bc\") " pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.296992 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qntgd"] Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.301286 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:05:53 crc kubenswrapper[4831]: W0309 16:05:53.304528 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod806fda88_2f52_4fae_a835_3ffd3fd0e55e.slice/crio-96a22e6dedfa716072a188c35d82a644b6138701feab61e29e9093667365788f WatchSource:0}: Error finding container 96a22e6dedfa716072a188c35d82a644b6138701feab61e29e9093667365788f: Status 404 returned error can't find the container with id 96a22e6dedfa716072a188c35d82a644b6138701feab61e29e9093667365788f Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.705947 4831 generic.go:334] "Generic (PLEG): container finished" podID="19e8f5e0-340b-4e8d-8af0-ec018733fe09" containerID="532687cfc8efdee10eddb79f4665f1776f1ba83055e443b4fb686f3c21cbcfc6" exitCode=0 Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.706392 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-k56j6" event={"ID":"19e8f5e0-340b-4e8d-8af0-ec018733fe09","Type":"ContainerDied","Data":"532687cfc8efdee10eddb79f4665f1776f1ba83055e443b4fb686f3c21cbcfc6"} Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.709429 4831 generic.go:334] "Generic (PLEG): container finished" podID="806fda88-2f52-4fae-a835-3ffd3fd0e55e" containerID="d156d17bbd083ebee8f9722ddda285c41b9da2f61cb8c8b5258501c7244800b9" exitCode=0 Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.709486 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qntgd" event={"ID":"806fda88-2f52-4fae-a835-3ffd3fd0e55e","Type":"ContainerDied","Data":"d156d17bbd083ebee8f9722ddda285c41b9da2f61cb8c8b5258501c7244800b9"} Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.709514 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qntgd" event={"ID":"806fda88-2f52-4fae-a835-3ffd3fd0e55e","Type":"ContainerStarted","Data":"96a22e6dedfa716072a188c35d82a644b6138701feab61e29e9093667365788f"} Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.713129 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx6mz"] Mar 09 16:05:53 crc kubenswrapper[4831]: I0309 16:05:53.723822 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sr4l" event={"ID":"bc0c6900-0238-4664-a91c-057629716456","Type":"ContainerStarted","Data":"e9c50a39db878dc1b186a359aaeca9709f5d24d96471dbb9756ca26c31609fd8"} Mar 09 16:05:54 crc kubenswrapper[4831]: I0309 16:05:54.756662 4831 generic.go:334] "Generic (PLEG): container finished" podID="52031b53-29e3-4087-ac0a-db35877849bc" containerID="c326f8ad56463f620192f034ea2a43014eed6e569b5d6f47c3b93117a991ea8b" exitCode=0 Mar 09 16:05:54 crc kubenswrapper[4831]: I0309 16:05:54.756784 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mx6mz" event={"ID":"52031b53-29e3-4087-ac0a-db35877849bc","Type":"ContainerDied","Data":"c326f8ad56463f620192f034ea2a43014eed6e569b5d6f47c3b93117a991ea8b"} Mar 09 16:05:54 crc kubenswrapper[4831]: I0309 16:05:54.757975 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx6mz" event={"ID":"52031b53-29e3-4087-ac0a-db35877849bc","Type":"ContainerStarted","Data":"c6ef45251f80c800ba58b106263f323529f097954464d8f031a4c77648e33d2c"} Mar 09 16:05:54 crc kubenswrapper[4831]: I0309 16:05:54.766903 4831 generic.go:334] "Generic (PLEG): container finished" podID="bc0c6900-0238-4664-a91c-057629716456" containerID="e9c50a39db878dc1b186a359aaeca9709f5d24d96471dbb9756ca26c31609fd8" exitCode=0 Mar 09 16:05:54 crc kubenswrapper[4831]: I0309 16:05:54.767013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sr4l" event={"ID":"bc0c6900-0238-4664-a91c-057629716456","Type":"ContainerDied","Data":"e9c50a39db878dc1b186a359aaeca9709f5d24d96471dbb9756ca26c31609fd8"} Mar 09 16:05:54 crc kubenswrapper[4831]: I0309 16:05:54.771817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k56j6" event={"ID":"19e8f5e0-340b-4e8d-8af0-ec018733fe09","Type":"ContainerStarted","Data":"7c756e6a2bc1a2b1de4e880de17522c523c815eb27f8018f8d571f16cc95d7ab"} Mar 09 16:05:54 crc kubenswrapper[4831]: I0309 16:05:54.853635 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k56j6" podStartSLOduration=2.374627545 podStartE2EDuration="4.853614222s" podCreationTimestamp="2026-03-09 16:05:50 +0000 UTC" firstStartedPulling="2026-03-09 16:05:51.690378507 +0000 UTC m=+478.824060930" lastFinishedPulling="2026-03-09 16:05:54.169365174 +0000 UTC m=+481.303047607" observedRunningTime="2026-03-09 16:05:54.852536051 +0000 UTC m=+481.986218484" 
watchObservedRunningTime="2026-03-09 16:05:54.853614222 +0000 UTC m=+481.987296655" Mar 09 16:05:55 crc kubenswrapper[4831]: I0309 16:05:55.806634 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sr4l" event={"ID":"bc0c6900-0238-4664-a91c-057629716456","Type":"ContainerStarted","Data":"b0c99a00709708b7b362011577aed9633b9950bbf181f5c94f08e2905c9c3797"} Mar 09 16:05:55 crc kubenswrapper[4831]: I0309 16:05:55.827451 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2sr4l" podStartSLOduration=2.332794632 podStartE2EDuration="5.82742351s" podCreationTimestamp="2026-03-09 16:05:50 +0000 UTC" firstStartedPulling="2026-03-09 16:05:51.685681571 +0000 UTC m=+478.819364014" lastFinishedPulling="2026-03-09 16:05:55.180310469 +0000 UTC m=+482.313992892" observedRunningTime="2026-03-09 16:05:55.821514658 +0000 UTC m=+482.955197081" watchObservedRunningTime="2026-03-09 16:05:55.82742351 +0000 UTC m=+482.961105933" Mar 09 16:05:56 crc kubenswrapper[4831]: I0309 16:05:56.814441 4831 generic.go:334] "Generic (PLEG): container finished" podID="52031b53-29e3-4087-ac0a-db35877849bc" containerID="6a591a13a1fe17859df2f91cf11b6b17545242f373e815cdb019e5941db03091" exitCode=0 Mar 09 16:05:56 crc kubenswrapper[4831]: I0309 16:05:56.814532 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx6mz" event={"ID":"52031b53-29e3-4087-ac0a-db35877849bc","Type":"ContainerDied","Data":"6a591a13a1fe17859df2f91cf11b6b17545242f373e815cdb019e5941db03091"} Mar 09 16:05:56 crc kubenswrapper[4831]: I0309 16:05:56.963446 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-9z6ql" Mar 09 16:05:57 crc kubenswrapper[4831]: I0309 16:05:57.032716 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mbssj"] Mar 09 
16:05:57 crc kubenswrapper[4831]: I0309 16:05:57.824613 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx6mz" event={"ID":"52031b53-29e3-4087-ac0a-db35877849bc","Type":"ContainerStarted","Data":"313b9d75a435d2849ec5ffa53f57d4d1a7b9f66ea74f1325b02daa572581db33"} Mar 09 16:05:57 crc kubenswrapper[4831]: I0309 16:05:57.847847 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mx6mz" podStartSLOduration=3.305097828 podStartE2EDuration="5.847820114s" podCreationTimestamp="2026-03-09 16:05:52 +0000 UTC" firstStartedPulling="2026-03-09 16:05:54.761013006 +0000 UTC m=+481.894695439" lastFinishedPulling="2026-03-09 16:05:57.303735302 +0000 UTC m=+484.437417725" observedRunningTime="2026-03-09 16:05:57.847790663 +0000 UTC m=+484.981473106" watchObservedRunningTime="2026-03-09 16:05:57.847820114 +0000 UTC m=+484.981502537" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.147362 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551206-mc5tk"] Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.148430 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551206-mc5tk" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.150282 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.151491 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.151549 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.158824 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551206-mc5tk"] Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.276449 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj7x5\" (UniqueName: \"kubernetes.io/projected/bb22fe70-3840-4c85-8a94-f0cd3c67a6ad-kube-api-access-jj7x5\") pod \"auto-csr-approver-29551206-mc5tk\" (UID: \"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad\") " pod="openshift-infra/auto-csr-approver-29551206-mc5tk" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.377842 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj7x5\" (UniqueName: \"kubernetes.io/projected/bb22fe70-3840-4c85-8a94-f0cd3c67a6ad-kube-api-access-jj7x5\") pod \"auto-csr-approver-29551206-mc5tk\" (UID: \"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad\") " pod="openshift-infra/auto-csr-approver-29551206-mc5tk" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.404882 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj7x5\" (UniqueName: \"kubernetes.io/projected/bb22fe70-3840-4c85-8a94-f0cd3c67a6ad-kube-api-access-jj7x5\") pod \"auto-csr-approver-29551206-mc5tk\" (UID: \"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad\") " 
pod="openshift-infra/auto-csr-approver-29551206-mc5tk" Mar 09 16:06:00 crc kubenswrapper[4831]: I0309 16:06:00.476233 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551206-mc5tk" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.681350 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.681816 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.719691 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551206-mc5tk"] Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.742697 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.846995 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551206-mc5tk" event={"ID":"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad","Type":"ContainerStarted","Data":"a8089e51b41030ed3fe4297768ea5fc770a1fea112d2cdc077224920aac9f04c"} Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.848845 4831 generic.go:334] "Generic (PLEG): container finished" podID="806fda88-2f52-4fae-a835-3ffd3fd0e55e" containerID="fa5ba956d5f848d230ba1868ff83c29d60b2719b984fbba847bf1329d1d96703" exitCode=0 Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.848971 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qntgd" event={"ID":"806fda88-2f52-4fae-a835-3ffd3fd0e55e","Type":"ContainerDied","Data":"fa5ba956d5f848d230ba1868ff83c29d60b2719b984fbba847bf1329d1d96703"} Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.881643 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.881776 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:00.896561 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k56j6" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:01.858515 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qntgd" event={"ID":"806fda88-2f52-4fae-a835-3ffd3fd0e55e","Type":"ContainerStarted","Data":"dd2a6023b6071c99cecc6fe22fba4b5f0243fab7fdbe7f2dcae185d718b68a61"} Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:01.880359 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qntgd" podStartSLOduration=2.089622652 podStartE2EDuration="9.880334475s" podCreationTimestamp="2026-03-09 16:05:52 +0000 UTC" firstStartedPulling="2026-03-09 16:05:53.714215932 +0000 UTC m=+480.847898355" lastFinishedPulling="2026-03-09 16:06:01.504927755 +0000 UTC m=+488.638610178" observedRunningTime="2026-03-09 16:06:01.879340196 +0000 UTC m=+489.013022639" watchObservedRunningTime="2026-03-09 16:06:01.880334475 +0000 UTC m=+489.014016898" Mar 09 16:06:01 crc kubenswrapper[4831]: I0309 16:06:01.915360 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2sr4l" podUID="bc0c6900-0238-4664-a91c-057629716456" containerName="registry-server" probeResult="failure" output=< Mar 09 16:06:01 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Mar 09 16:06:01 crc kubenswrapper[4831]: > Mar 09 16:06:02 crc kubenswrapper[4831]: I0309 16:06:02.865342 4831 generic.go:334] "Generic (PLEG): container finished" podID="bb22fe70-3840-4c85-8a94-f0cd3c67a6ad" 
containerID="3b7dee95c3c4555ef23055b340f42c0b5e78fd5d43d205d30e80386f667f8162" exitCode=0 Mar 09 16:06:02 crc kubenswrapper[4831]: I0309 16:06:02.865442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551206-mc5tk" event={"ID":"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad","Type":"ContainerDied","Data":"3b7dee95c3c4555ef23055b340f42c0b5e78fd5d43d205d30e80386f667f8162"} Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.018699 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.019017 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.019066 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.019654 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d5b4725b752da5e408f875b6cd3b85c3aa6ec6e9210f6108fa23cf97bee9077"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.019710 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://7d5b4725b752da5e408f875b6cd3b85c3aa6ec6e9210f6108fa23cf97bee9077" gracePeriod=600 Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.072764 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.073510 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.301820 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.301914 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.365897 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.874171 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="7d5b4725b752da5e408f875b6cd3b85c3aa6ec6e9210f6108fa23cf97bee9077" exitCode=0 Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.874331 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"7d5b4725b752da5e408f875b6cd3b85c3aa6ec6e9210f6108fa23cf97bee9077"} Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.874725 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"1d5b3a31bd2d45e3d8a6e232faf4a5329e04814733b1a2c4ac587ab6fa664449"} Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.874772 4831 scope.go:117] "RemoveContainer" containerID="f14c9a4152efec3899080b4ba87917e25713f1ad46f03ee1ff441a11bc0d38f7" Mar 09 16:06:03 crc kubenswrapper[4831]: I0309 16:06:03.919152 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mx6mz" Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.108316 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qntgd" podUID="806fda88-2f52-4fae-a835-3ffd3fd0e55e" containerName="registry-server" probeResult="failure" output=< Mar 09 16:06:04 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Mar 09 16:06:04 crc kubenswrapper[4831]: > Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.111721 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551206-mc5tk" Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.255811 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj7x5\" (UniqueName: \"kubernetes.io/projected/bb22fe70-3840-4c85-8a94-f0cd3c67a6ad-kube-api-access-jj7x5\") pod \"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad\" (UID: \"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad\") " Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.263977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb22fe70-3840-4c85-8a94-f0cd3c67a6ad-kube-api-access-jj7x5" (OuterVolumeSpecName: "kube-api-access-jj7x5") pod "bb22fe70-3840-4c85-8a94-f0cd3c67a6ad" (UID: "bb22fe70-3840-4c85-8a94-f0cd3c67a6ad"). InnerVolumeSpecName "kube-api-access-jj7x5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.356941 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj7x5\" (UniqueName: \"kubernetes.io/projected/bb22fe70-3840-4c85-8a94-f0cd3c67a6ad-kube-api-access-jj7x5\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.887800 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551206-mc5tk" Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.891807 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551206-mc5tk" event={"ID":"bb22fe70-3840-4c85-8a94-f0cd3c67a6ad","Type":"ContainerDied","Data":"a8089e51b41030ed3fe4297768ea5fc770a1fea112d2cdc077224920aac9f04c"} Mar 09 16:06:04 crc kubenswrapper[4831]: I0309 16:06:04.891848 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8089e51b41030ed3fe4297768ea5fc770a1fea112d2cdc077224920aac9f04c" Mar 09 16:06:05 crc kubenswrapper[4831]: I0309 16:06:05.171561 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551200-x7tx8"] Mar 09 16:06:05 crc kubenswrapper[4831]: I0309 16:06:05.176249 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551200-x7tx8"] Mar 09 16:06:05 crc kubenswrapper[4831]: I0309 16:06:05.627191 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a013f059-7440-4f06-88f6-a73f3286d228" path="/var/lib/kubelet/pods/a013f059-7440-4f06-88f6-a73f3286d228/volumes" Mar 09 16:06:10 crc kubenswrapper[4831]: I0309 16:06:10.918383 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:06:10 crc kubenswrapper[4831]: I0309 16:06:10.961999 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-2sr4l" Mar 09 16:06:13 crc kubenswrapper[4831]: I0309 16:06:13.112795 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:06:13 crc kubenswrapper[4831]: I0309 16:06:13.163218 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qntgd" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.074565 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" podUID="643b13ec-dd30-4a47-b123-76c9c3a1b5b7" containerName="registry" containerID="cri-o://b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437" gracePeriod=30 Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.426092 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.530804 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-bound-sa-token\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.530938 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-installation-pull-secrets\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.530976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-trusted-ca\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.531253 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.531305 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-ca-trust-extracted\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.531367 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-tls\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.531426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9dgq\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-kube-api-access-d9dgq\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.531481 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-certificates\") pod \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\" (UID: \"643b13ec-dd30-4a47-b123-76c9c3a1b5b7\") " Mar 09 
16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.532199 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.533300 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.538598 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.538626 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.538647 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-kube-api-access-d9dgq" (OuterVolumeSpecName: "kube-api-access-d9dgq") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "kube-api-access-d9dgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.541380 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.552068 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.560452 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "643b13ec-dd30-4a47-b123-76c9c3a1b5b7" (UID: "643b13ec-dd30-4a47-b123-76c9c3a1b5b7"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.633093 4831 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.633146 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9dgq\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-kube-api-access-d9dgq\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.633166 4831 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.633177 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.633190 4831 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.633201 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:22 crc kubenswrapper[4831]: I0309 16:06:22.633211 4831 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/643b13ec-dd30-4a47-b123-76c9c3a1b5b7-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 16:06:23 crc 
kubenswrapper[4831]: I0309 16:06:23.004651 4831 generic.go:334] "Generic (PLEG): container finished" podID="643b13ec-dd30-4a47-b123-76c9c3a1b5b7" containerID="b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437" exitCode=0 Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.004690 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" event={"ID":"643b13ec-dd30-4a47-b123-76c9c3a1b5b7","Type":"ContainerDied","Data":"b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437"} Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.004712 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" event={"ID":"643b13ec-dd30-4a47-b123-76c9c3a1b5b7","Type":"ContainerDied","Data":"3b931bb92a353621b063f98e2071f4f50b9bddae1d5f24ff55e213dfe044ec16"} Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.004722 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mbssj" Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.004727 4831 scope.go:117] "RemoveContainer" containerID="b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437" Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.055806 4831 scope.go:117] "RemoveContainer" containerID="b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437" Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.056879 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mbssj"] Mar 09 16:06:23 crc kubenswrapper[4831]: E0309 16:06:23.058635 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437\": container with ID starting with b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437 not found: ID does not exist" containerID="b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437" Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.058686 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437"} err="failed to get container status \"b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437\": rpc error: code = NotFound desc = could not find container \"b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437\": container with ID starting with b1dce38f59f4f98b2d7dc3a10eeb36189ba36baae2695cbc18f34d8188b7f437 not found: ID does not exist" Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.066645 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mbssj"] Mar 09 16:06:23 crc kubenswrapper[4831]: I0309 16:06:23.625506 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="643b13ec-dd30-4a47-b123-76c9c3a1b5b7" path="/var/lib/kubelet/pods/643b13ec-dd30-4a47-b123-76c9c3a1b5b7/volumes" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.133233 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551208-pchsj"] Mar 09 16:08:00 crc kubenswrapper[4831]: E0309 16:08:00.134091 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b13ec-dd30-4a47-b123-76c9c3a1b5b7" containerName="registry" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.134109 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="643b13ec-dd30-4a47-b123-76c9c3a1b5b7" containerName="registry" Mar 09 16:08:00 crc kubenswrapper[4831]: E0309 16:08:00.134136 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb22fe70-3840-4c85-8a94-f0cd3c67a6ad" containerName="oc" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.134144 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb22fe70-3840-4c85-8a94-f0cd3c67a6ad" containerName="oc" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.134270 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb22fe70-3840-4c85-8a94-f0cd3c67a6ad" containerName="oc" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.134288 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="643b13ec-dd30-4a47-b123-76c9c3a1b5b7" containerName="registry" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.134710 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551208-pchsj" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.137172 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.137180 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.138296 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.150221 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551208-pchsj"] Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.250312 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lqcm\" (UniqueName: \"kubernetes.io/projected/e3b3f171-585c-4127-970e-da778a2f1d83-kube-api-access-2lqcm\") pod \"auto-csr-approver-29551208-pchsj\" (UID: \"e3b3f171-585c-4127-970e-da778a2f1d83\") " pod="openshift-infra/auto-csr-approver-29551208-pchsj" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.351782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lqcm\" (UniqueName: \"kubernetes.io/projected/e3b3f171-585c-4127-970e-da778a2f1d83-kube-api-access-2lqcm\") pod \"auto-csr-approver-29551208-pchsj\" (UID: \"e3b3f171-585c-4127-970e-da778a2f1d83\") " pod="openshift-infra/auto-csr-approver-29551208-pchsj" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.376527 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lqcm\" (UniqueName: \"kubernetes.io/projected/e3b3f171-585c-4127-970e-da778a2f1d83-kube-api-access-2lqcm\") pod \"auto-csr-approver-29551208-pchsj\" (UID: \"e3b3f171-585c-4127-970e-da778a2f1d83\") " 
pod="openshift-infra/auto-csr-approver-29551208-pchsj" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.451671 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551208-pchsj" Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.687262 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551208-pchsj"] Mar 09 16:08:00 crc kubenswrapper[4831]: I0309 16:08:00.700700 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:08:01 crc kubenswrapper[4831]: I0309 16:08:01.669309 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551208-pchsj" event={"ID":"e3b3f171-585c-4127-970e-da778a2f1d83","Type":"ContainerStarted","Data":"ab32ccab88621eed3588c87b9b16b29e4dd31ceda10d182238f715a6d52a76c4"} Mar 09 16:08:02 crc kubenswrapper[4831]: I0309 16:08:02.676069 4831 generic.go:334] "Generic (PLEG): container finished" podID="e3b3f171-585c-4127-970e-da778a2f1d83" containerID="878379a99e9fd114576b8468d1a26af8e7f697904ffb0668e0cf5c42c793857d" exitCode=0 Mar 09 16:08:02 crc kubenswrapper[4831]: I0309 16:08:02.676132 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551208-pchsj" event={"ID":"e3b3f171-585c-4127-970e-da778a2f1d83","Type":"ContainerDied","Data":"878379a99e9fd114576b8468d1a26af8e7f697904ffb0668e0cf5c42c793857d"} Mar 09 16:08:03 crc kubenswrapper[4831]: I0309 16:08:03.019284 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:08:03 crc kubenswrapper[4831]: I0309 16:08:03.019352 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:08:03 crc kubenswrapper[4831]: I0309 16:08:03.918839 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551208-pchsj" Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.026596 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lqcm\" (UniqueName: \"kubernetes.io/projected/e3b3f171-585c-4127-970e-da778a2f1d83-kube-api-access-2lqcm\") pod \"e3b3f171-585c-4127-970e-da778a2f1d83\" (UID: \"e3b3f171-585c-4127-970e-da778a2f1d83\") " Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.033773 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b3f171-585c-4127-970e-da778a2f1d83-kube-api-access-2lqcm" (OuterVolumeSpecName: "kube-api-access-2lqcm") pod "e3b3f171-585c-4127-970e-da778a2f1d83" (UID: "e3b3f171-585c-4127-970e-da778a2f1d83"). InnerVolumeSpecName "kube-api-access-2lqcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.128558 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lqcm\" (UniqueName: \"kubernetes.io/projected/e3b3f171-585c-4127-970e-da778a2f1d83-kube-api-access-2lqcm\") on node \"crc\" DevicePath \"\"" Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.688337 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551208-pchsj" event={"ID":"e3b3f171-585c-4127-970e-da778a2f1d83","Type":"ContainerDied","Data":"ab32ccab88621eed3588c87b9b16b29e4dd31ceda10d182238f715a6d52a76c4"} Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.688379 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab32ccab88621eed3588c87b9b16b29e4dd31ceda10d182238f715a6d52a76c4" Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.688424 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551208-pchsj" Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.988726 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551202-rgtkz"] Mar 09 16:08:04 crc kubenswrapper[4831]: I0309 16:08:04.992072 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551202-rgtkz"] Mar 09 16:08:05 crc kubenswrapper[4831]: I0309 16:08:05.626282 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33678e26-b1b2-419f-93ef-85ba9e935155" path="/var/lib/kubelet/pods/33678e26-b1b2-419f-93ef-85ba9e935155/volumes" Mar 09 16:08:33 crc kubenswrapper[4831]: I0309 16:08:33.043657 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 09 16:08:33 crc kubenswrapper[4831]: I0309 16:08:33.044229 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:08:53 crc kubenswrapper[4831]: I0309 16:08:53.905305 4831 scope.go:117] "RemoveContainer" containerID="533d2358e13f31f47eafe8b323538343a36cfe938cd4e76fafc31a237d2bc94e" Mar 09 16:08:53 crc kubenswrapper[4831]: I0309 16:08:53.951021 4831 scope.go:117] "RemoveContainer" containerID="f0d6460c1557da868152743db7dc7bd72657fe2e1045b75281d92f71d0fa9b65" Mar 09 16:08:53 crc kubenswrapper[4831]: I0309 16:08:53.981383 4831 scope.go:117] "RemoveContainer" containerID="2d79d6e84601cfe1fa803e497d038b68ccfcc51af3b4dc406c11a1f3aaad2c5a" Mar 09 16:09:03 crc kubenswrapper[4831]: I0309 16:09:03.018968 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:09:03 crc kubenswrapper[4831]: I0309 16:09:03.019649 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:09:03 crc kubenswrapper[4831]: I0309 16:09:03.019733 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:09:03 crc kubenswrapper[4831]: I0309 16:09:03.020603 4831 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d5b3a31bd2d45e3d8a6e232faf4a5329e04814733b1a2c4ac587ab6fa664449"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:09:03 crc kubenswrapper[4831]: I0309 16:09:03.020700 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://1d5b3a31bd2d45e3d8a6e232faf4a5329e04814733b1a2c4ac587ab6fa664449" gracePeriod=600 Mar 09 16:09:04 crc kubenswrapper[4831]: I0309 16:09:04.085803 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="1d5b3a31bd2d45e3d8a6e232faf4a5329e04814733b1a2c4ac587ab6fa664449" exitCode=0 Mar 09 16:09:04 crc kubenswrapper[4831]: I0309 16:09:04.085910 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"1d5b3a31bd2d45e3d8a6e232faf4a5329e04814733b1a2c4ac587ab6fa664449"} Mar 09 16:09:04 crc kubenswrapper[4831]: I0309 16:09:04.086308 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"1c76c6414699eae246a06d1e97818d4928fb333dbba2c6b4163c0c46e43c62ea"} Mar 09 16:09:04 crc kubenswrapper[4831]: I0309 16:09:04.086331 4831 scope.go:117] "RemoveContainer" containerID="7d5b4725b752da5e408f875b6cd3b85c3aa6ec6e9210f6108fa23cf97bee9077" Mar 09 16:09:54 crc kubenswrapper[4831]: I0309 16:09:54.021274 4831 scope.go:117] "RemoveContainer" 
containerID="7039dd4fab36a97b97ac50d9e38551bc0c7b458e5a6489bdcb23d1dd52b541e7" Mar 09 16:09:54 crc kubenswrapper[4831]: I0309 16:09:54.037807 4831 scope.go:117] "RemoveContainer" containerID="a4e66708b18d563c97a50d20b20227e74d862a5c3a95746a75eba0032126c75b" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.139430 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551210-rtbsr"] Mar 09 16:10:00 crc kubenswrapper[4831]: E0309 16:10:00.140244 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b3f171-585c-4127-970e-da778a2f1d83" containerName="oc" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.140264 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b3f171-585c-4127-970e-da778a2f1d83" containerName="oc" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.140515 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b3f171-585c-4127-970e-da778a2f1d83" containerName="oc" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.141015 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.143992 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.144151 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.144899 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.147200 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551210-rtbsr"] Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.262102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbn9p\" (UniqueName: \"kubernetes.io/projected/e3436d06-d059-4710-9ab2-97360867646b-kube-api-access-jbn9p\") pod \"auto-csr-approver-29551210-rtbsr\" (UID: \"e3436d06-d059-4710-9ab2-97360867646b\") " pod="openshift-infra/auto-csr-approver-29551210-rtbsr" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.364204 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbn9p\" (UniqueName: \"kubernetes.io/projected/e3436d06-d059-4710-9ab2-97360867646b-kube-api-access-jbn9p\") pod \"auto-csr-approver-29551210-rtbsr\" (UID: \"e3436d06-d059-4710-9ab2-97360867646b\") " pod="openshift-infra/auto-csr-approver-29551210-rtbsr" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.398656 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbn9p\" (UniqueName: \"kubernetes.io/projected/e3436d06-d059-4710-9ab2-97360867646b-kube-api-access-jbn9p\") pod \"auto-csr-approver-29551210-rtbsr\" (UID: \"e3436d06-d059-4710-9ab2-97360867646b\") " 
pod="openshift-infra/auto-csr-approver-29551210-rtbsr" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.475520 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" Mar 09 16:10:00 crc kubenswrapper[4831]: I0309 16:10:00.740723 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551210-rtbsr"] Mar 09 16:10:01 crc kubenswrapper[4831]: I0309 16:10:01.432147 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" event={"ID":"e3436d06-d059-4710-9ab2-97360867646b","Type":"ContainerStarted","Data":"38051636d275cec67f8d794ccbfbe99d29ade6263000c413f0b40a879d764f01"} Mar 09 16:10:02 crc kubenswrapper[4831]: I0309 16:10:02.438062 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" event={"ID":"e3436d06-d059-4710-9ab2-97360867646b","Type":"ContainerStarted","Data":"b33de5dbb8383c74b2f28df0a6747886dd4da73f5ecf531181710d39a6f3fb46"} Mar 09 16:10:02 crc kubenswrapper[4831]: I0309 16:10:02.454379 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" podStartSLOduration=1.186911714 podStartE2EDuration="2.454356045s" podCreationTimestamp="2026-03-09 16:10:00 +0000 UTC" firstStartedPulling="2026-03-09 16:10:00.763328612 +0000 UTC m=+727.897011035" lastFinishedPulling="2026-03-09 16:10:02.030772903 +0000 UTC m=+729.164455366" observedRunningTime="2026-03-09 16:10:02.451064331 +0000 UTC m=+729.584746764" watchObservedRunningTime="2026-03-09 16:10:02.454356045 +0000 UTC m=+729.588038468" Mar 09 16:10:03 crc kubenswrapper[4831]: I0309 16:10:03.445040 4831 generic.go:334] "Generic (PLEG): container finished" podID="e3436d06-d059-4710-9ab2-97360867646b" containerID="b33de5dbb8383c74b2f28df0a6747886dd4da73f5ecf531181710d39a6f3fb46" exitCode=0 Mar 09 16:10:03 crc 
kubenswrapper[4831]: I0309 16:10:03.445091 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" event={"ID":"e3436d06-d059-4710-9ab2-97360867646b","Type":"ContainerDied","Data":"b33de5dbb8383c74b2f28df0a6747886dd4da73f5ecf531181710d39a6f3fb46"} Mar 09 16:10:04 crc kubenswrapper[4831]: I0309 16:10:04.701675 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" Mar 09 16:10:04 crc kubenswrapper[4831]: I0309 16:10:04.833318 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbn9p\" (UniqueName: \"kubernetes.io/projected/e3436d06-d059-4710-9ab2-97360867646b-kube-api-access-jbn9p\") pod \"e3436d06-d059-4710-9ab2-97360867646b\" (UID: \"e3436d06-d059-4710-9ab2-97360867646b\") " Mar 09 16:10:04 crc kubenswrapper[4831]: I0309 16:10:04.848580 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3436d06-d059-4710-9ab2-97360867646b-kube-api-access-jbn9p" (OuterVolumeSpecName: "kube-api-access-jbn9p") pod "e3436d06-d059-4710-9ab2-97360867646b" (UID: "e3436d06-d059-4710-9ab2-97360867646b"). InnerVolumeSpecName "kube-api-access-jbn9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:10:04 crc kubenswrapper[4831]: I0309 16:10:04.939620 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbn9p\" (UniqueName: \"kubernetes.io/projected/e3436d06-d059-4710-9ab2-97360867646b-kube-api-access-jbn9p\") on node \"crc\" DevicePath \"\"" Mar 09 16:10:05 crc kubenswrapper[4831]: I0309 16:10:05.467251 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" event={"ID":"e3436d06-d059-4710-9ab2-97360867646b","Type":"ContainerDied","Data":"38051636d275cec67f8d794ccbfbe99d29ade6263000c413f0b40a879d764f01"} Mar 09 16:10:05 crc kubenswrapper[4831]: I0309 16:10:05.467329 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38051636d275cec67f8d794ccbfbe99d29ade6263000c413f0b40a879d764f01" Mar 09 16:10:05 crc kubenswrapper[4831]: I0309 16:10:05.467369 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551210-rtbsr" Mar 09 16:10:05 crc kubenswrapper[4831]: I0309 16:10:05.533508 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551204-bqcbg"] Mar 09 16:10:05 crc kubenswrapper[4831]: I0309 16:10:05.538206 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551204-bqcbg"] Mar 09 16:10:05 crc kubenswrapper[4831]: I0309 16:10:05.624882 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12621d7-3190-4c92-b62e-9f0e684ce767" path="/var/lib/kubelet/pods/f12621d7-3190-4c92-b62e-9f0e684ce767/volumes" Mar 09 16:11:03 crc kubenswrapper[4831]: I0309 16:11:03.018909 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 09 16:11:03 crc kubenswrapper[4831]: I0309 16:11:03.020576 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.766673 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7jxjf"] Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.767854 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-controller" containerID="cri-o://1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.768313 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="sbdb" containerID="cri-o://c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.768394 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="nbdb" containerID="cri-o://0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.768480 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="northd" 
containerID="cri-o://9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.768520 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.768557 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-node" containerID="cri-o://4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.768593 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-acl-logging" containerID="cri-o://ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.840377 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller" containerID="cri-o://540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" gracePeriod=30 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.927823 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/3.log" Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.932359 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovn-acl-logging/0.log" Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.932920 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovn-controller/0.log" Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.933411 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" exitCode=0 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.933452 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" exitCode=143 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.933446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae"} Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.933468 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" exitCode=143 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.933505 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf"} Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.933518 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" 
event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c"} Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.934795 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/2.log" Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.935074 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/1.log" Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.935107 4831 generic.go:334] "Generic (PLEG): container finished" podID="c53277d4-7695-47e5-bacc-e6ab6dca1501" containerID="b0d8e1fbe63294dc472671512e54914adf4171dc77407e3355faa93329952062" exitCode=2 Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.935137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerDied","Data":"b0d8e1fbe63294dc472671512e54914adf4171dc77407e3355faa93329952062"} Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.935171 4831 scope.go:117] "RemoveContainer" containerID="8f95930aa1d2dc586b5c7a228b01d4e9e15cd698cb1e1047e003334d6be6e7d6" Mar 09 16:11:15 crc kubenswrapper[4831]: I0309 16:11:15.935784 4831 scope.go:117] "RemoveContainer" containerID="b0d8e1fbe63294dc472671512e54914adf4171dc77407e3355faa93329952062" Mar 09 16:11:15 crc kubenswrapper[4831]: E0309 16:11:15.936046 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9c746_openshift-multus(c53277d4-7695-47e5-bacc-e6ab6dca1501)\"" pod="openshift-multus/multus-9c746" podUID="c53277d4-7695-47e5-bacc-e6ab6dca1501" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.113460 4831 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/4.log"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.114094 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/3.log"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.116261 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovn-acl-logging/0.log"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.116670 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovn-controller/0.log"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.117037 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171030 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9sx7h"]
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171252 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171265 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171274 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171280 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171286 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171294 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171303 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kubecfg-setup"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171308 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kubecfg-setup"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171317 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="nbdb"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171323 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="nbdb"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171335 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-acl-logging"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171344 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-acl-logging"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171353 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171360 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171373 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-ovn-metrics"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171381 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-ovn-metrics"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171411 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-node"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171419 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-node"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171430 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3436d06-d059-4710-9ab2-97360867646b" containerName="oc"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171438 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3436d06-d059-4710-9ab2-97360867646b" containerName="oc"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171445 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="sbdb"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171453 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="sbdb"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171464 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171472 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171480 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="northd"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171487 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="northd"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171574 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171583 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-ovn-metrics"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171593 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171601 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="sbdb"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171609 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="nbdb"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171618 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3436d06-d059-4710-9ab2-97360867646b" containerName="oc"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171628 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171638 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171645 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="northd"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171656 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="kube-rbac-proxy-node"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171668 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovn-acl-logging"
Mar 09 16:11:16 crc kubenswrapper[4831]: E0309 16:11:16.171788 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171799 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.171921 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.172091 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerName="ovnkube-controller"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.173443 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202517 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-systemd-units\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202575 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-env-overrides\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202608 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-log-socket\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-ovnkube-config\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202760 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202790 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-cni-bin\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202820 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-slash\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202849 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-cni-netd\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202870 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-systemd\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202900 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f865313a-5f09-4089-a116-c41ece13d6b8-ovn-node-metrics-cert\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.202929 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-ovnkube-script-lib\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-etc-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-kubelet\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-run-netns\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5lbk\" (UniqueName: \"kubernetes.io/projected/f865313a-5f09-4089-a116-c41ece13d6b8-kube-api-access-x5lbk\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203308 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203325 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203370 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-var-lib-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203444 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-node-log\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.203464 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-ovn\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.303801 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-ovn\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.303865 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-var-lib-openvswitch\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.303902 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-log-socket\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.303970 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-script-lib\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304023 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp564\" (UniqueName: \"kubernetes.io/projected/498bff7b-8be5-4e87-8717-0de7f7a8b877-kube-api-access-xp564\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-openvswitch\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304105 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-kubelet\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304153 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-ovn-kubernetes\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304193 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-netns\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304236 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-slash\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304280 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-bin\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304322 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-netd\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304383 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-systemd-units\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-config\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304510 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-etc-openvswitch\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304552 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-systemd\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304115 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304225 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304539 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304562 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-log-socket" (OuterVolumeSpecName: "log-socket") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304581 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-slash" (OuterVolumeSpecName: "host-slash") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304594 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304627 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-env-overrides\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304653 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304663 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-node-log\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304706 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-var-lib-cni-networks-ovn-kubernetes\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304754 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovn-node-metrics-cert\") pod \"498bff7b-8be5-4e87-8717-0de7f7a8b877\" (UID: \"498bff7b-8be5-4e87-8717-0de7f7a8b877\") "
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304671 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304917 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-node-log" (OuterVolumeSpecName: "node-log") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304943 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304950 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304967 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.304986 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305000 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-var-lib-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305072 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-node-log\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305111 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-ovn\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305158 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-systemd-units\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305207 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-env-overrides\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305253 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-log-socket\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305257 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305295 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-ovnkube-config\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305340 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305342 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305444 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305453 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-cni-bin\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305496 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-cni-bin\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305534 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-ovn\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h"
Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305548
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-slash\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305573 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-var-lib-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305592 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-node-log\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-cni-netd\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305740 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-log-socket\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305747 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-systemd\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305767 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-run-systemd\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305808 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-cni-netd\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305831 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305868 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-systemd-units\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.305947 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f865313a-5f09-4089-a116-c41ece13d6b8-ovn-node-metrics-cert\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.306068 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-slash\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.306329 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-env-overrides\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-ovnkube-config\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307148 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-ovnkube-script-lib\") pod 
\"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307204 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-etc-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307247 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-kubelet\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307284 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-run-netns\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307333 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5lbk\" (UniqueName: \"kubernetes.io/projected/f865313a-5f09-4089-a116-c41ece13d6b8-kube-api-access-x5lbk\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307514 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-etc-openvswitch\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-run-netns\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307589 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f865313a-5f09-4089-a116-c41ece13d6b8-host-kubelet\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308566 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f865313a-5f09-4089-a116-c41ece13d6b8-ovnkube-script-lib\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.307475 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308673 4831 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308700 4831 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-var-lib-cni-networks-ovn-kubernetes\") on node 
\"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308728 4831 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308754 4831 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308777 4831 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308801 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308825 4831 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308847 4831 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308869 4831 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308891 4831 
reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308913 4831 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308934 4831 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308957 4831 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.308983 4831 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.309006 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.309030 4831 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.312066 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.312235 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f865313a-5f09-4089-a116-c41ece13d6b8-ovn-node-metrics-cert\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.313849 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498bff7b-8be5-4e87-8717-0de7f7a8b877-kube-api-access-xp564" (OuterVolumeSpecName: "kube-api-access-xp564") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "kube-api-access-xp564". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.330742 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5lbk\" (UniqueName: \"kubernetes.io/projected/f865313a-5f09-4089-a116-c41ece13d6b8-kube-api-access-x5lbk\") pod \"ovnkube-node-9sx7h\" (UID: \"f865313a-5f09-4089-a116-c41ece13d6b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.334793 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "498bff7b-8be5-4e87-8717-0de7f7a8b877" (UID: "498bff7b-8be5-4e87-8717-0de7f7a8b877"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.409942 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp564\" (UniqueName: \"kubernetes.io/projected/498bff7b-8be5-4e87-8717-0de7f7a8b877-kube-api-access-xp564\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.409983 4831 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/498bff7b-8be5-4e87-8717-0de7f7a8b877-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.409993 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/498bff7b-8be5-4e87-8717-0de7f7a8b877-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.488091 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:16 crc kubenswrapper[4831]: W0309 16:11:16.518832 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf865313a_5f09_4089_a116_c41ece13d6b8.slice/crio-e192c4d2d60039522756ed04eb83800b9ea581fad842db57d51786d4d5e81989 WatchSource:0}: Error finding container e192c4d2d60039522756ed04eb83800b9ea581fad842db57d51786d4d5e81989: Status 404 returned error can't find the container with id e192c4d2d60039522756ed04eb83800b9ea581fad842db57d51786d4d5e81989 Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.942604 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/4.log" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.946265 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovnkube-controller/3.log" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.949254 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovn-acl-logging/0.log" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.949887 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7jxjf_498bff7b-8be5-4e87-8717-0de7f7a8b877/ovn-controller/0.log" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950625 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" exitCode=2 Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950650 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" exitCode=0 Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950660 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" exitCode=0 Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950668 4831 generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" exitCode=0 Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950668 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950681 4831 
generic.go:334] "Generic (PLEG): container finished" podID="498bff7b-8be5-4e87-8717-0de7f7a8b877" containerID="4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" exitCode=0 Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950716 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950774 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" event={"ID":"498bff7b-8be5-4e87-8717-0de7f7a8b877","Type":"ContainerDied","Data":"b1659cb6d435b1f14250f65dff28760dfec76d127b9f93b040e2404abe8dcd21"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950787 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950805 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950815 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950830 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950838 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950844 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950851 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950856 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950862 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.950750 4831 scope.go:117] "RemoveContainer" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.957794 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7jxjf" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.965376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerDied","Data":"8d9398fe601821b36f428659e138cd60a634716ada103ea86a701e35d9607e29"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.965467 4831 generic.go:334] "Generic (PLEG): container finished" podID="f865313a-5f09-4089-a116-c41ece13d6b8" containerID="8d9398fe601821b36f428659e138cd60a634716ada103ea86a701e35d9607e29" exitCode=0 Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.965611 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"e192c4d2d60039522756ed04eb83800b9ea581fad842db57d51786d4d5e81989"} Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.968261 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/2.log" Mar 09 16:11:16 crc kubenswrapper[4831]: I0309 16:11:16.984561 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.007478 4831 scope.go:117] "RemoveContainer" containerID="c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 
16:11:17.014933 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7jxjf"] Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.019785 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7jxjf"] Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.038333 4831 scope.go:117] "RemoveContainer" containerID="0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.059023 4831 scope.go:117] "RemoveContainer" containerID="9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.083247 4831 scope.go:117] "RemoveContainer" containerID="e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.099450 4831 scope.go:117] "RemoveContainer" containerID="4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.114014 4831 scope.go:117] "RemoveContainer" containerID="ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.147807 4831 scope.go:117] "RemoveContainer" containerID="1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.171149 4831 scope.go:117] "RemoveContainer" containerID="7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.187443 4831 scope.go:117] "RemoveContainer" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.188125 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": container with ID 
starting with 540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128 not found: ID does not exist" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.188156 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128"} err="failed to get container status \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": rpc error: code = NotFound desc = could not find container \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": container with ID starting with 540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.188182 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.189464 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": container with ID starting with 12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f not found: ID does not exist" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.189497 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f"} err="failed to get container status \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": rpc error: code = NotFound desc = could not find container \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": container with ID starting with 12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f not found: 
ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.189517 4831 scope.go:117] "RemoveContainer" containerID="c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.189753 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": container with ID starting with c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7 not found: ID does not exist" containerID="c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.189782 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7"} err="failed to get container status \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": rpc error: code = NotFound desc = could not find container \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": container with ID starting with c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.189797 4831 scope.go:117] "RemoveContainer" containerID="0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.190134 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": container with ID starting with 0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df not found: ID does not exist" containerID="0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.190181 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df"} err="failed to get container status \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": rpc error: code = NotFound desc = could not find container \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": container with ID starting with 0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.190218 4831 scope.go:117] "RemoveContainer" containerID="9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.190536 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": container with ID starting with 9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c not found: ID does not exist" containerID="9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.190559 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c"} err="failed to get container status \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": rpc error: code = NotFound desc = could not find container \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": container with ID starting with 9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.190573 4831 scope.go:117] "RemoveContainer" containerID="e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.190803 4831 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": container with ID starting with e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae not found: ID does not exist" containerID="e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.190845 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae"} err="failed to get container status \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": rpc error: code = NotFound desc = could not find container \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": container with ID starting with e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.190865 4831 scope.go:117] "RemoveContainer" containerID="4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.191083 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": container with ID starting with 4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b not found: ID does not exist" containerID="4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191105 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b"} err="failed to get container status \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": rpc error: code = NotFound desc = could 
not find container \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": container with ID starting with 4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191117 4831 scope.go:117] "RemoveContainer" containerID="ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.191314 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": container with ID starting with ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf not found: ID does not exist" containerID="ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191334 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf"} err="failed to get container status \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": rpc error: code = NotFound desc = could not find container \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": container with ID starting with ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191348 4831 scope.go:117] "RemoveContainer" containerID="1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.191595 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": container with ID starting with 1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c not found: 
ID does not exist" containerID="1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191622 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c"} err="failed to get container status \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": rpc error: code = NotFound desc = could not find container \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": container with ID starting with 1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191643 4831 scope.go:117] "RemoveContainer" containerID="7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9" Mar 09 16:11:17 crc kubenswrapper[4831]: E0309 16:11:17.191834 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": container with ID starting with 7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9 not found: ID does not exist" containerID="7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191855 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9"} err="failed to get container status \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": rpc error: code = NotFound desc = could not find container \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": container with ID starting with 7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.191868 4831 
scope.go:117] "RemoveContainer" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192078 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128"} err="failed to get container status \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": rpc error: code = NotFound desc = could not find container \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": container with ID starting with 540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192100 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192303 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f"} err="failed to get container status \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": rpc error: code = NotFound desc = could not find container \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": container with ID starting with 12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192322 4831 scope.go:117] "RemoveContainer" containerID="c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192610 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7"} err="failed to get container status \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": rpc 
error: code = NotFound desc = could not find container \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": container with ID starting with c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192634 4831 scope.go:117] "RemoveContainer" containerID="0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192877 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df"} err="failed to get container status \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": rpc error: code = NotFound desc = could not find container \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": container with ID starting with 0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.192904 4831 scope.go:117] "RemoveContainer" containerID="9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.193138 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c"} err="failed to get container status \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": rpc error: code = NotFound desc = could not find container \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": container with ID starting with 9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.193159 4831 scope.go:117] "RemoveContainer" containerID="e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" Mar 09 16:11:17 crc 
kubenswrapper[4831]: I0309 16:11:17.193364 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae"} err="failed to get container status \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": rpc error: code = NotFound desc = could not find container \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": container with ID starting with e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.193390 4831 scope.go:117] "RemoveContainer" containerID="4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.193638 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b"} err="failed to get container status \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": rpc error: code = NotFound desc = could not find container \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": container with ID starting with 4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.193660 4831 scope.go:117] "RemoveContainer" containerID="ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.193896 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf"} err="failed to get container status \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": rpc error: code = NotFound desc = could not find container \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": container 
with ID starting with ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.193924 4831 scope.go:117] "RemoveContainer" containerID="1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194146 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c"} err="failed to get container status \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": rpc error: code = NotFound desc = could not find container \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": container with ID starting with 1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194164 4831 scope.go:117] "RemoveContainer" containerID="7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194334 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9"} err="failed to get container status \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": rpc error: code = NotFound desc = could not find container \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": container with ID starting with 7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194358 4831 scope.go:117] "RemoveContainer" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194617 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128"} err="failed to get container status \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": rpc error: code = NotFound desc = could not find container \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": container with ID starting with 540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194641 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194830 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f"} err="failed to get container status \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": rpc error: code = NotFound desc = could not find container \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": container with ID starting with 12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.194850 4831 scope.go:117] "RemoveContainer" containerID="c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195008 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7"} err="failed to get container status \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": rpc error: code = NotFound desc = could not find container \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": container with ID starting with c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7 not found: ID does not 
exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195031 4831 scope.go:117] "RemoveContainer" containerID="0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195167 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df"} err="failed to get container status \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": rpc error: code = NotFound desc = could not find container \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": container with ID starting with 0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195192 4831 scope.go:117] "RemoveContainer" containerID="9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195336 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c"} err="failed to get container status \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": rpc error: code = NotFound desc = could not find container \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": container with ID starting with 9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195353 4831 scope.go:117] "RemoveContainer" containerID="e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195535 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae"} err="failed to get container status 
\"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": rpc error: code = NotFound desc = could not find container \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": container with ID starting with e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195554 4831 scope.go:117] "RemoveContainer" containerID="4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195773 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b"} err="failed to get container status \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": rpc error: code = NotFound desc = could not find container \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": container with ID starting with 4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195796 4831 scope.go:117] "RemoveContainer" containerID="ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.195992 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf"} err="failed to get container status \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": rpc error: code = NotFound desc = could not find container \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": container with ID starting with ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.196005 4831 scope.go:117] "RemoveContainer" 
containerID="1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.196253 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c"} err="failed to get container status \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": rpc error: code = NotFound desc = could not find container \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": container with ID starting with 1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.196275 4831 scope.go:117] "RemoveContainer" containerID="7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.196596 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9"} err="failed to get container status \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": rpc error: code = NotFound desc = could not find container \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": container with ID starting with 7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.196618 4831 scope.go:117] "RemoveContainer" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.196858 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128"} err="failed to get container status \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": rpc error: code = NotFound desc = could 
not find container \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": container with ID starting with 540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.196883 4831 scope.go:117] "RemoveContainer" containerID="12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197103 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f"} err="failed to get container status \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": rpc error: code = NotFound desc = could not find container \"12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f\": container with ID starting with 12e158123a3068ab2abceba61cd116973ba68d7767715c0c13e1f6bdde4ff47f not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197130 4831 scope.go:117] "RemoveContainer" containerID="c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197320 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7"} err="failed to get container status \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": rpc error: code = NotFound desc = could not find container \"c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7\": container with ID starting with c24540f6524d108fadfc30dd805470559249130e3ff2a1ae9db19fd3b49217f7 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197343 4831 scope.go:117] "RemoveContainer" containerID="0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 
16:11:17.197536 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df"} err="failed to get container status \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": rpc error: code = NotFound desc = could not find container \"0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df\": container with ID starting with 0e99ddb7e8b7689ee7e458bd6aef91cdc680a7a196037252f5f357a2f49771df not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197559 4831 scope.go:117] "RemoveContainer" containerID="9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197728 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c"} err="failed to get container status \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": rpc error: code = NotFound desc = could not find container \"9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c\": container with ID starting with 9983b165a918c38b6c94874d56412306f4ded8faca0554ce0928e4335bb72c3c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197750 4831 scope.go:117] "RemoveContainer" containerID="e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197947 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae"} err="failed to get container status \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": rpc error: code = NotFound desc = could not find container \"e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae\": container with ID starting with 
e4594f29648a43a892bfebf66b77d9c984a95cea0a206c1f9a1f2e0542a49cae not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.197970 4831 scope.go:117] "RemoveContainer" containerID="4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198151 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b"} err="failed to get container status \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": rpc error: code = NotFound desc = could not find container \"4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b\": container with ID starting with 4c7c93f151eb7b3e3c2e154e52dc657cfa7fcf560af5933fa1d070761c146d1b not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198177 4831 scope.go:117] "RemoveContainer" containerID="ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198360 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf"} err="failed to get container status \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": rpc error: code = NotFound desc = could not find container \"ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf\": container with ID starting with ffe48777925e0f112171a9e0e8d19f438731d4f69b318d473a31c13f8d1f6faf not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198384 4831 scope.go:117] "RemoveContainer" containerID="1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198598 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c"} err="failed to get container status \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": rpc error: code = NotFound desc = could not find container \"1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c\": container with ID starting with 1e37d8817d781e8e1825ce154039dda637d139613da08ac77ae349cc5677c58c not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198622 4831 scope.go:117] "RemoveContainer" containerID="7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198780 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9"} err="failed to get container status \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": rpc error: code = NotFound desc = could not find container \"7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9\": container with ID starting with 7d2f4f084d75b071f869a9be345b167e8b54d65d96d4bffe131c5310ebe639d9 not found: ID does not exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198804 4831 scope.go:117] "RemoveContainer" containerID="540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.198992 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128"} err="failed to get container status \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": rpc error: code = NotFound desc = could not find container \"540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128\": container with ID starting with 540b1e89b2eacf680b1f7ab165cfd59a88a557623dbd52d9378d92e5d409c128 not found: ID does not 
exist" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.625916 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498bff7b-8be5-4e87-8717-0de7f7a8b877" path="/var/lib/kubelet/pods/498bff7b-8be5-4e87-8717-0de7f7a8b877/volumes" Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.978785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"c2572b6122267fb198400c0917893f704ebf816288c386131c5ae7780afc4f6d"} Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.978839 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"bec5dc542231f16ebdb9318f84c72e86fba16d3209ca13358361da5681715495"} Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.978856 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"4025a78c10bff093e1e8856554208ac9a97b978d91766fcb9a62947433f7b98c"} Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.978868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"7439555d751261f7c821939bc456711074a7df7d49f68867dadef9248bc10bdc"} Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.978880 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"d1fef753428a6e46955529da791d5919abce4c447345aa8d2fca61f68cb19758"} Mar 09 16:11:17 crc kubenswrapper[4831]: I0309 16:11:17.978892 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"f5fea6c1dcb82a39316635b3b6e2583a52012e371453790f26319baaa9191625"} Mar 09 16:11:20 crc kubenswrapper[4831]: I0309 16:11:20.004702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"f6ca094826978451a3edde4ab62cec74f528755d53f137f461178097156c19e9"} Mar 09 16:11:22 crc kubenswrapper[4831]: I0309 16:11:22.026551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" event={"ID":"f865313a-5f09-4089-a116-c41ece13d6b8","Type":"ContainerStarted","Data":"33e9022473895a68d2d9ef124003cd00facbeed11f6232e51f1ac2deed3ea358"} Mar 09 16:11:22 crc kubenswrapper[4831]: I0309 16:11:22.026943 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:22 crc kubenswrapper[4831]: I0309 16:11:22.027268 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:22 crc kubenswrapper[4831]: I0309 16:11:22.059611 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:22 crc kubenswrapper[4831]: I0309 16:11:22.063385 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" podStartSLOduration=6.06336539 podStartE2EDuration="6.06336539s" podCreationTimestamp="2026-03-09 16:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:11:22.060618971 +0000 UTC m=+809.194301414" watchObservedRunningTime="2026-03-09 16:11:22.06336539 +0000 UTC m=+809.197047813" Mar 09 16:11:23 crc 
kubenswrapper[4831]: I0309 16:11:23.031468 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:23 crc kubenswrapper[4831]: I0309 16:11:23.055633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:30 crc kubenswrapper[4831]: I0309 16:11:30.617528 4831 scope.go:117] "RemoveContainer" containerID="b0d8e1fbe63294dc472671512e54914adf4171dc77407e3355faa93329952062" Mar 09 16:11:30 crc kubenswrapper[4831]: E0309 16:11:30.618503 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9c746_openshift-multus(c53277d4-7695-47e5-bacc-e6ab6dca1501)\"" pod="openshift-multus/multus-9c746" podUID="c53277d4-7695-47e5-bacc-e6ab6dca1501" Mar 09 16:11:33 crc kubenswrapper[4831]: I0309 16:11:33.018645 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:11:33 crc kubenswrapper[4831]: I0309 16:11:33.018946 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.514443 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275"] Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.515972 4831 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.519211 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.520185 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275"] Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.659200 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.659288 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.659329 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whh27\" (UniqueName: \"kubernetes.io/projected/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-kube-api-access-whh27\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc 
kubenswrapper[4831]: I0309 16:11:35.761434 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whh27\" (UniqueName: \"kubernetes.io/projected/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-kube-api-access-whh27\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.761620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.761702 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.762231 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.762291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.783874 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whh27\" (UniqueName: \"kubernetes.io/projected/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-kube-api-access-whh27\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: I0309 16:11:35.833651 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: E0309 16:11:35.873874 4831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(01f042e8df6390b9840d1aefe395bbc283ff8204ee18a594fd3711949aa9147e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:11:35 crc kubenswrapper[4831]: E0309 16:11:35.873982 4831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(01f042e8df6390b9840d1aefe395bbc283ff8204ee18a594fd3711949aa9147e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: E0309 16:11:35.874022 4831 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(01f042e8df6390b9840d1aefe395bbc283ff8204ee18a594fd3711949aa9147e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:35 crc kubenswrapper[4831]: E0309 16:11:35.874109 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace(5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace(5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(01f042e8df6390b9840d1aefe395bbc283ff8204ee18a594fd3711949aa9147e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" Mar 09 16:11:36 crc kubenswrapper[4831]: I0309 16:11:36.103608 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:36 crc kubenswrapper[4831]: I0309 16:11:36.104609 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:36 crc kubenswrapper[4831]: E0309 16:11:36.125533 4831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(e9ea85ac063fccfe85fa26c322e3dc92fa7249d3cf58300ea77a3146880ac47c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 16:11:36 crc kubenswrapper[4831]: E0309 16:11:36.125608 4831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(e9ea85ac063fccfe85fa26c322e3dc92fa7249d3cf58300ea77a3146880ac47c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:36 crc kubenswrapper[4831]: E0309 16:11:36.125640 4831 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(e9ea85ac063fccfe85fa26c322e3dc92fa7249d3cf58300ea77a3146880ac47c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:36 crc kubenswrapper[4831]: E0309 16:11:36.125712 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace(5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace(5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_openshift-marketplace_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0_0(e9ea85ac063fccfe85fa26c322e3dc92fa7249d3cf58300ea77a3146880ac47c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" Mar 09 16:11:45 crc kubenswrapper[4831]: I0309 16:11:45.617818 4831 scope.go:117] "RemoveContainer" containerID="b0d8e1fbe63294dc472671512e54914adf4171dc77407e3355faa93329952062" Mar 09 16:11:46 crc kubenswrapper[4831]: I0309 16:11:46.175618 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c746_c53277d4-7695-47e5-bacc-e6ab6dca1501/kube-multus/2.log" Mar 09 16:11:46 crc kubenswrapper[4831]: I0309 16:11:46.175978 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c746" event={"ID":"c53277d4-7695-47e5-bacc-e6ab6dca1501","Type":"ContainerStarted","Data":"85dab57e38339f7bccb1d7710943fdccce49c051281a77f822da315b25ef36b1"} Mar 09 16:11:46 crc kubenswrapper[4831]: I0309 16:11:46.512648 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sx7h" Mar 09 16:11:50 crc kubenswrapper[4831]: 
I0309 16:11:50.616939 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:50 crc kubenswrapper[4831]: I0309 16:11:50.617634 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:51 crc kubenswrapper[4831]: I0309 16:11:51.117200 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275"] Mar 09 16:11:51 crc kubenswrapper[4831]: W0309 16:11:51.126739 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5892b2b0_41c8_4fbc_8a19_ca44a6a35ed0.slice/crio-de46ea16560eb8dde7ba2ff11922a98e674ed0b5ea68a3349b5381a598b3ea2c WatchSource:0}: Error finding container de46ea16560eb8dde7ba2ff11922a98e674ed0b5ea68a3349b5381a598b3ea2c: Status 404 returned error can't find the container with id de46ea16560eb8dde7ba2ff11922a98e674ed0b5ea68a3349b5381a598b3ea2c Mar 09 16:11:51 crc kubenswrapper[4831]: I0309 16:11:51.206626 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" event={"ID":"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0","Type":"ContainerStarted","Data":"de46ea16560eb8dde7ba2ff11922a98e674ed0b5ea68a3349b5381a598b3ea2c"} Mar 09 16:11:52 crc kubenswrapper[4831]: I0309 16:11:52.214673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" event={"ID":"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0","Type":"ContainerStarted","Data":"b99e6b1e0b2ef73b6fab5bf3c736cd48b2bf2856d5c5880d9ba6804bb353492b"} Mar 09 16:11:53 crc kubenswrapper[4831]: I0309 16:11:53.222111 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerID="b99e6b1e0b2ef73b6fab5bf3c736cd48b2bf2856d5c5880d9ba6804bb353492b" exitCode=0 Mar 09 16:11:53 crc kubenswrapper[4831]: I0309 16:11:53.222195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" event={"ID":"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0","Type":"ContainerDied","Data":"b99e6b1e0b2ef73b6fab5bf3c736cd48b2bf2856d5c5880d9ba6804bb353492b"} Mar 09 16:11:54 crc kubenswrapper[4831]: I0309 16:11:54.085915 4831 scope.go:117] "RemoveContainer" containerID="83c2d419e353fdf32f96a626038be28ded202fbe7c1caa9d4de0aca90855caf8" Mar 09 16:11:56 crc kubenswrapper[4831]: I0309 16:11:56.249950 4831 generic.go:334] "Generic (PLEG): container finished" podID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerID="03c82d46fcf849a44b083469b5ca0453bcb185f8bb7525826216fc837df2abaf" exitCode=0 Mar 09 16:11:56 crc kubenswrapper[4831]: I0309 16:11:56.250026 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" event={"ID":"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0","Type":"ContainerDied","Data":"03c82d46fcf849a44b083469b5ca0453bcb185f8bb7525826216fc837df2abaf"} Mar 09 16:11:57 crc kubenswrapper[4831]: I0309 16:11:57.259192 4831 generic.go:334] "Generic (PLEG): container finished" podID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerID="9c3ad1bc01a5789058667b6c28a393a8c158066f6d86cfa684ce7862c49d3ef1" exitCode=0 Mar 09 16:11:57 crc kubenswrapper[4831]: I0309 16:11:57.259301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" event={"ID":"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0","Type":"ContainerDied","Data":"9c3ad1bc01a5789058667b6c28a393a8c158066f6d86cfa684ce7862c49d3ef1"} Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.493391 4831 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.673033 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whh27\" (UniqueName: \"kubernetes.io/projected/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-kube-api-access-whh27\") pod \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.673107 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-bundle\") pod \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.673129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-util\") pod \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\" (UID: \"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0\") " Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.675112 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-bundle" (OuterVolumeSpecName: "bundle") pod "5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" (UID: "5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.682839 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-kube-api-access-whh27" (OuterVolumeSpecName: "kube-api-access-whh27") pod "5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" (UID: "5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0"). 
InnerVolumeSpecName "kube-api-access-whh27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.698330 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-util" (OuterVolumeSpecName: "util") pod "5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" (UID: "5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.774600 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whh27\" (UniqueName: \"kubernetes.io/projected/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-kube-api-access-whh27\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.774645 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:58 crc kubenswrapper[4831]: I0309 16:11:58.774658 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0-util\") on node \"crc\" DevicePath \"\"" Mar 09 16:11:59 crc kubenswrapper[4831]: I0309 16:11:59.271131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" event={"ID":"5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0","Type":"ContainerDied","Data":"de46ea16560eb8dde7ba2ff11922a98e674ed0b5ea68a3349b5381a598b3ea2c"} Mar 09 16:11:59 crc kubenswrapper[4831]: I0309 16:11:59.271187 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de46ea16560eb8dde7ba2ff11922a98e674ed0b5ea68a3349b5381a598b3ea2c" Mar 09 16:11:59 crc kubenswrapper[4831]: I0309 16:11:59.271218 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.158175 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551212-j58bc"] Mar 09 16:12:00 crc kubenswrapper[4831]: E0309 16:12:00.158538 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerName="util" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.158560 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerName="util" Mar 09 16:12:00 crc kubenswrapper[4831]: E0309 16:12:00.158579 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerName="pull" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.158587 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerName="pull" Mar 09 16:12:00 crc kubenswrapper[4831]: E0309 16:12:00.158612 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerName="extract" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.158630 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerName="extract" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.158813 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0" containerName="extract" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.159256 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551212-j58bc" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.162127 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.162313 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.163609 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.163964 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551212-j58bc"] Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.194827 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkxm6\" (UniqueName: \"kubernetes.io/projected/3bee73df-f272-4e5f-9879-3c0ded43f3ab-kube-api-access-kkxm6\") pod \"auto-csr-approver-29551212-j58bc\" (UID: \"3bee73df-f272-4e5f-9879-3c0ded43f3ab\") " pod="openshift-infra/auto-csr-approver-29551212-j58bc" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.296347 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkxm6\" (UniqueName: \"kubernetes.io/projected/3bee73df-f272-4e5f-9879-3c0ded43f3ab-kube-api-access-kkxm6\") pod \"auto-csr-approver-29551212-j58bc\" (UID: \"3bee73df-f272-4e5f-9879-3c0ded43f3ab\") " pod="openshift-infra/auto-csr-approver-29551212-j58bc" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.315560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkxm6\" (UniqueName: \"kubernetes.io/projected/3bee73df-f272-4e5f-9879-3c0ded43f3ab-kube-api-access-kkxm6\") pod \"auto-csr-approver-29551212-j58bc\" (UID: \"3bee73df-f272-4e5f-9879-3c0ded43f3ab\") " 
pod="openshift-infra/auto-csr-approver-29551212-j58bc" Mar 09 16:12:00 crc kubenswrapper[4831]: I0309 16:12:00.489005 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551212-j58bc" Mar 09 16:12:01 crc kubenswrapper[4831]: I0309 16:12:01.069224 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551212-j58bc"] Mar 09 16:12:01 crc kubenswrapper[4831]: W0309 16:12:01.079356 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bee73df_f272_4e5f_9879_3c0ded43f3ab.slice/crio-531eca6ee60d6bacba5034fb4c6d7d8eee7e0481a1416a80f5cc1a0bf4600ff7 WatchSource:0}: Error finding container 531eca6ee60d6bacba5034fb4c6d7d8eee7e0481a1416a80f5cc1a0bf4600ff7: Status 404 returned error can't find the container with id 531eca6ee60d6bacba5034fb4c6d7d8eee7e0481a1416a80f5cc1a0bf4600ff7 Mar 09 16:12:01 crc kubenswrapper[4831]: I0309 16:12:01.282227 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551212-j58bc" event={"ID":"3bee73df-f272-4e5f-9879-3c0ded43f3ab","Type":"ContainerStarted","Data":"531eca6ee60d6bacba5034fb4c6d7d8eee7e0481a1416a80f5cc1a0bf4600ff7"} Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.018285 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.018687 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.018741 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.019424 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c76c6414699eae246a06d1e97818d4928fb333dbba2c6b4163c0c46e43c62ea"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.019485 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://1c76c6414699eae246a06d1e97818d4928fb333dbba2c6b4163c0c46e43c62ea" gracePeriod=600 Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.294550 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551212-j58bc" event={"ID":"3bee73df-f272-4e5f-9879-3c0ded43f3ab","Type":"ContainerStarted","Data":"81196dc320296aaf9ae38b8a4fca71ef7f1383ff61dd0db391d67d1656f182d4"} Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.297454 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="1c76c6414699eae246a06d1e97818d4928fb333dbba2c6b4163c0c46e43c62ea" exitCode=0 Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.297493 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"1c76c6414699eae246a06d1e97818d4928fb333dbba2c6b4163c0c46e43c62ea"} Mar 
09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.297526 4831 scope.go:117] "RemoveContainer" containerID="1d5b3a31bd2d45e3d8a6e232faf4a5329e04814733b1a2c4ac587ab6fa664449" Mar 09 16:12:03 crc kubenswrapper[4831]: I0309 16:12:03.311065 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551212-j58bc" podStartSLOduration=1.4766242840000001 podStartE2EDuration="3.311047214s" podCreationTimestamp="2026-03-09 16:12:00 +0000 UTC" firstStartedPulling="2026-03-09 16:12:01.081896308 +0000 UTC m=+848.215578731" lastFinishedPulling="2026-03-09 16:12:02.916319238 +0000 UTC m=+850.050001661" observedRunningTime="2026-03-09 16:12:03.308325246 +0000 UTC m=+850.442007669" watchObservedRunningTime="2026-03-09 16:12:03.311047214 +0000 UTC m=+850.444729637" Mar 09 16:12:04 crc kubenswrapper[4831]: I0309 16:12:04.303949 4831 generic.go:334] "Generic (PLEG): container finished" podID="3bee73df-f272-4e5f-9879-3c0ded43f3ab" containerID="81196dc320296aaf9ae38b8a4fca71ef7f1383ff61dd0db391d67d1656f182d4" exitCode=0 Mar 09 16:12:04 crc kubenswrapper[4831]: I0309 16:12:04.304036 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551212-j58bc" event={"ID":"3bee73df-f272-4e5f-9879-3c0ded43f3ab","Type":"ContainerDied","Data":"81196dc320296aaf9ae38b8a4fca71ef7f1383ff61dd0db391d67d1656f182d4"} Mar 09 16:12:04 crc kubenswrapper[4831]: I0309 16:12:04.306019 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"1edaf25bc17b1a3de007db1b821f8bf147583ed96a9d4890d9a1fd5ed460feab"} Mar 09 16:12:05 crc kubenswrapper[4831]: I0309 16:12:05.555506 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551212-j58bc" Mar 09 16:12:05 crc kubenswrapper[4831]: I0309 16:12:05.717175 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkxm6\" (UniqueName: \"kubernetes.io/projected/3bee73df-f272-4e5f-9879-3c0ded43f3ab-kube-api-access-kkxm6\") pod \"3bee73df-f272-4e5f-9879-3c0ded43f3ab\" (UID: \"3bee73df-f272-4e5f-9879-3c0ded43f3ab\") " Mar 09 16:12:05 crc kubenswrapper[4831]: I0309 16:12:05.729817 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bee73df-f272-4e5f-9879-3c0ded43f3ab-kube-api-access-kkxm6" (OuterVolumeSpecName: "kube-api-access-kkxm6") pod "3bee73df-f272-4e5f-9879-3c0ded43f3ab" (UID: "3bee73df-f272-4e5f-9879-3c0ded43f3ab"). InnerVolumeSpecName "kube-api-access-kkxm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:12:05 crc kubenswrapper[4831]: I0309 16:12:05.819377 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkxm6\" (UniqueName: \"kubernetes.io/projected/3bee73df-f272-4e5f-9879-3c0ded43f3ab-kube-api-access-kkxm6\") on node \"crc\" DevicePath \"\"" Mar 09 16:12:06 crc kubenswrapper[4831]: I0309 16:12:06.318748 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551212-j58bc" event={"ID":"3bee73df-f272-4e5f-9879-3c0ded43f3ab","Type":"ContainerDied","Data":"531eca6ee60d6bacba5034fb4c6d7d8eee7e0481a1416a80f5cc1a0bf4600ff7"} Mar 09 16:12:06 crc kubenswrapper[4831]: I0309 16:12:06.318783 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551212-j58bc" Mar 09 16:12:06 crc kubenswrapper[4831]: I0309 16:12:06.318797 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="531eca6ee60d6bacba5034fb4c6d7d8eee7e0481a1416a80f5cc1a0bf4600ff7" Mar 09 16:12:06 crc kubenswrapper[4831]: I0309 16:12:06.378308 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551206-mc5tk"] Mar 09 16:12:06 crc kubenswrapper[4831]: I0309 16:12:06.381773 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551206-mc5tk"] Mar 09 16:12:07 crc kubenswrapper[4831]: I0309 16:12:07.623033 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb22fe70-3840-4c85-8a94-f0cd3c67a6ad" path="/var/lib/kubelet/pods/bb22fe70-3840-4c85-8a94-f0cd3c67a6ad/volumes" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.242651 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr"] Mar 09 16:12:08 crc kubenswrapper[4831]: E0309 16:12:08.243201 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bee73df-f272-4e5f-9879-3c0ded43f3ab" containerName="oc" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.243224 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bee73df-f272-4e5f-9879-3c0ded43f3ab" containerName="oc" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.243347 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bee73df-f272-4e5f-9879-3c0ded43f3ab" containerName="oc" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.243858 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:08 crc kubenswrapper[4831]: W0309 16:12:08.245517 4831 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 09 16:12:08 crc kubenswrapper[4831]: E0309 16:12:08.245571 4831 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 16:12:08 crc kubenswrapper[4831]: W0309 16:12:08.245597 4831 reflector.go:561] object-"metallb-system"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 09 16:12:08 crc kubenswrapper[4831]: E0309 16:12:08.245641 4831 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 16:12:08 crc kubenswrapper[4831]: W0309 16:12:08.245884 4831 reflector.go:561] object-"metallb-system"/"kube-root-ca.crt": failed to list 
*v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 09 16:12:08 crc kubenswrapper[4831]: E0309 16:12:08.245971 4831 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 16:12:08 crc kubenswrapper[4831]: W0309 16:12:08.248708 4831 reflector.go:561] object-"metallb-system"/"manager-account-dockercfg-dnw69": failed to list *v1.Secret: secrets "manager-account-dockercfg-dnw69" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 09 16:12:08 crc kubenswrapper[4831]: E0309 16:12:08.248773 4831 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"manager-account-dockercfg-dnw69\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"manager-account-dockercfg-dnw69\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 16:12:08 crc kubenswrapper[4831]: W0309 16:12:08.248714 4831 reflector.go:561] object-"metallb-system"/"metallb-operator-controller-manager-service-cert": failed to list *v1.Secret: secrets "metallb-operator-controller-manager-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 09 16:12:08 crc 
kubenswrapper[4831]: E0309 16:12:08.248807 4831 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-controller-manager-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-controller-manager-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.274274 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr"] Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.351633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-apiservice-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.351676 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-webhook-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.351702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bc57\" (UniqueName: \"kubernetes.io/projected/49b4858c-b806-4899-9332-1a23e118cf9e-kube-api-access-2bc57\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") 
" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.452929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-apiservice-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.453190 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-webhook-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.453303 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bc57\" (UniqueName: \"kubernetes.io/projected/49b4858c-b806-4899-9332-1a23e118cf9e-kube-api-access-2bc57\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.484128 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl"] Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.484987 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.487473 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.487618 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.487645 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-trsjf" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.514782 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl"] Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.655661 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d412b43-90c3-4c5e-9967-22fabd2055b4-webhook-cert\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.655755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8dh\" (UniqueName: \"kubernetes.io/projected/3d412b43-90c3-4c5e-9967-22fabd2055b4-kube-api-access-cb8dh\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.655778 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/3d412b43-90c3-4c5e-9967-22fabd2055b4-apiservice-cert\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.757450 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d412b43-90c3-4c5e-9967-22fabd2055b4-webhook-cert\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.757556 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8dh\" (UniqueName: \"kubernetes.io/projected/3d412b43-90c3-4c5e-9967-22fabd2055b4-kube-api-access-cb8dh\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.757584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d412b43-90c3-4c5e-9967-22fabd2055b4-apiservice-cert\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.763471 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d412b43-90c3-4c5e-9967-22fabd2055b4-apiservice-cert\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 
16:12:08 crc kubenswrapper[4831]: I0309 16:12:08.763657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d412b43-90c3-4c5e-9967-22fabd2055b4-webhook-cert\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.417783 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 09 16:12:09 crc kubenswrapper[4831]: E0309 16:12:09.454207 4831 secret.go:188] Couldn't get secret metallb-system/metallb-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 16:12:09 crc kubenswrapper[4831]: E0309 16:12:09.454323 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-webhook-cert podName:49b4858c-b806-4899-9332-1a23e118cf9e nodeName:}" failed. No retries permitted until 2026-03-09 16:12:09.954297654 +0000 UTC m=+857.087980077 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-webhook-cert") pod "metallb-operator-controller-manager-5c78bb468c-hspmr" (UID: "49b4858c-b806-4899-9332-1a23e118cf9e") : failed to sync secret cache: timed out waiting for the condition Mar 09 16:12:09 crc kubenswrapper[4831]: E0309 16:12:09.454232 4831 secret.go:188] Couldn't get secret metallb-system/metallb-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 16:12:09 crc kubenswrapper[4831]: E0309 16:12:09.454430 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-apiservice-cert podName:49b4858c-b806-4899-9332-1a23e118cf9e nodeName:}" failed. No retries permitted until 2026-03-09 16:12:09.954416177 +0000 UTC m=+857.088098600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-apiservice-cert") pod "metallb-operator-controller-manager-5c78bb468c-hspmr" (UID: "49b4858c-b806-4899-9332-1a23e118cf9e") : failed to sync secret cache: timed out waiting for the condition Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.469108 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dnw69" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.506540 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.511230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8dh\" (UniqueName: \"kubernetes.io/projected/3d412b43-90c3-4c5e-9967-22fabd2055b4-kube-api-access-cb8dh\") pod \"metallb-operator-webhook-server-7bb99c7556-ztbpl\" (UID: \"3d412b43-90c3-4c5e-9967-22fabd2055b4\") " 
pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.513780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bc57\" (UniqueName: \"kubernetes.io/projected/49b4858c-b806-4899-9332-1a23e118cf9e-kube-api-access-2bc57\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.579071 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.699308 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.720328 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.910433 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl"] Mar 09 16:12:09 crc kubenswrapper[4831]: W0309 16:12:09.917764 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d412b43_90c3_4c5e_9967_22fabd2055b4.slice/crio-e334d8fa7efe9dee98b17e821400945825213a994b8c0b916e4e68355a5329dc WatchSource:0}: Error finding container e334d8fa7efe9dee98b17e821400945825213a994b8c0b916e4e68355a5329dc: Status 404 returned error can't find the container with id e334d8fa7efe9dee98b17e821400945825213a994b8c0b916e4e68355a5329dc Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.969513 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-apiservice-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.969571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-webhook-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.974961 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-webhook-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:09 crc kubenswrapper[4831]: I0309 16:12:09.975195 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49b4858c-b806-4899-9332-1a23e118cf9e-apiservice-cert\") pod \"metallb-operator-controller-manager-5c78bb468c-hspmr\" (UID: \"49b4858c-b806-4899-9332-1a23e118cf9e\") " pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:10 crc kubenswrapper[4831]: I0309 16:12:10.122087 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:10 crc kubenswrapper[4831]: I0309 16:12:10.349351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" event={"ID":"3d412b43-90c3-4c5e-9967-22fabd2055b4","Type":"ContainerStarted","Data":"e334d8fa7efe9dee98b17e821400945825213a994b8c0b916e4e68355a5329dc"} Mar 09 16:12:10 crc kubenswrapper[4831]: I0309 16:12:10.362690 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr"] Mar 09 16:12:10 crc kubenswrapper[4831]: W0309 16:12:10.373565 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49b4858c_b806_4899_9332_1a23e118cf9e.slice/crio-78f5eb28b2d475ee7c75bea1925f6553cebe1256796892bf6edb8d4db8dff7ec WatchSource:0}: Error finding container 78f5eb28b2d475ee7c75bea1925f6553cebe1256796892bf6edb8d4db8dff7ec: Status 404 returned error can't find the container with id 78f5eb28b2d475ee7c75bea1925f6553cebe1256796892bf6edb8d4db8dff7ec Mar 09 16:12:11 crc kubenswrapper[4831]: I0309 16:12:11.355432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" event={"ID":"49b4858c-b806-4899-9332-1a23e118cf9e","Type":"ContainerStarted","Data":"78f5eb28b2d475ee7c75bea1925f6553cebe1256796892bf6edb8d4db8dff7ec"} Mar 09 16:12:12 crc kubenswrapper[4831]: I0309 16:12:12.468436 4831 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 16:12:16 crc kubenswrapper[4831]: I0309 16:12:16.387033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" 
event={"ID":"3d412b43-90c3-4c5e-9967-22fabd2055b4","Type":"ContainerStarted","Data":"0ac256d59ce2e46da2b62d9c7ae565d3d5c66c8e625a58a3e07d5ce38d944163"} Mar 09 16:12:16 crc kubenswrapper[4831]: I0309 16:12:16.387487 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:16 crc kubenswrapper[4831]: I0309 16:12:16.388741 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" event={"ID":"49b4858c-b806-4899-9332-1a23e118cf9e","Type":"ContainerStarted","Data":"d1db1395047c6974f464acf39867b502abcac72ed8db23af3ba35c493d2d9c7b"} Mar 09 16:12:16 crc kubenswrapper[4831]: I0309 16:12:16.388950 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:16 crc kubenswrapper[4831]: I0309 16:12:16.404782 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" podStartSLOduration=2.872369907 podStartE2EDuration="8.404769056s" podCreationTimestamp="2026-03-09 16:12:08 +0000 UTC" firstStartedPulling="2026-03-09 16:12:09.921054578 +0000 UTC m=+857.054737001" lastFinishedPulling="2026-03-09 16:12:15.453453727 +0000 UTC m=+862.587136150" observedRunningTime="2026-03-09 16:12:16.402550822 +0000 UTC m=+863.536233245" watchObservedRunningTime="2026-03-09 16:12:16.404769056 +0000 UTC m=+863.538451479" Mar 09 16:12:16 crc kubenswrapper[4831]: I0309 16:12:16.432221 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" podStartSLOduration=3.385537674 podStartE2EDuration="8.432198723s" podCreationTimestamp="2026-03-09 16:12:08 +0000 UTC" firstStartedPulling="2026-03-09 16:12:10.377818966 +0000 UTC m=+857.511501389" lastFinishedPulling="2026-03-09 
16:12:15.424480015 +0000 UTC m=+862.558162438" observedRunningTime="2026-03-09 16:12:16.426899261 +0000 UTC m=+863.560581714" watchObservedRunningTime="2026-03-09 16:12:16.432198723 +0000 UTC m=+863.565881166" Mar 09 16:12:29 crc kubenswrapper[4831]: I0309 16:12:29.705259 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7bb99c7556-ztbpl" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.124671 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5c78bb468c-hspmr" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.839212 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-czs2j"] Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.842239 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.844453 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.844546 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-rpx8l" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.845904 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.849867 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr"] Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.850479 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.853168 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.864944 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr"] Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.945852 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jm6qg"] Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.946907 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jm6qg" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.949528 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.949531 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.949670 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-npm7l" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.949809 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.962438 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-lnrcv"] Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.963249 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.964862 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.975235 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-lnrcv"] Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988226 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-reloader\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988297 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvb7p\" (UniqueName: \"kubernetes.io/projected/12da408b-c4d6-4196-bb46-9e9c741b0819-kube-api-access-pvb7p\") pod \"frr-k8s-webhook-server-7f989f654f-pdcxr\" (UID: \"12da408b-c4d6-4196-bb46-9e9c741b0819\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988343 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc627a54-0b63-4b4a-a9fc-db74921f2a63-metrics-certs\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988431 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-metrics\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 
crc kubenswrapper[4831]: I0309 16:12:50.988537 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-conf\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988613 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-startup\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988654 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6fgl\" (UniqueName: \"kubernetes.io/projected/dc627a54-0b63-4b4a-a9fc-db74921f2a63-kube-api-access-z6fgl\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988694 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-sockets\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:50 crc kubenswrapper[4831]: I0309 16:12:50.988722 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12da408b-c4d6-4196-bb46-9e9c741b0819-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pdcxr\" (UID: \"12da408b-c4d6-4196-bb46-9e9c741b0819\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090198 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-startup\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-metrics-certs\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090312 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpswq\" (UniqueName: \"kubernetes.io/projected/68cf559a-a36c-4fae-9e8c-7130a85dd894-kube-api-access-cpswq\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090333 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6fgl\" (UniqueName: \"kubernetes.io/projected/dc627a54-0b63-4b4a-a9fc-db74921f2a63-kube-api-access-z6fgl\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090361 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-sockets\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090381 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/12da408b-c4d6-4196-bb46-9e9c741b0819-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pdcxr\" (UID: \"12da408b-c4d6-4196-bb46-9e9c741b0819\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090418 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ae44691-87ac-4a4e-83c9-8f11dce70777-metallb-excludel2\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090450 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68cf559a-a36c-4fae-9e8c-7130a85dd894-cert\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090471 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68cf559a-a36c-4fae-9e8c-7130a85dd894-metrics-certs\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-reloader\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: E0309 16:12:51.090516 4831 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 09 16:12:51 crc kubenswrapper[4831]: 
I0309 16:12:51.090538 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvb7p\" (UniqueName: \"kubernetes.io/projected/12da408b-c4d6-4196-bb46-9e9c741b0819-kube-api-access-pvb7p\") pod \"frr-k8s-webhook-server-7f989f654f-pdcxr\" (UID: \"12da408b-c4d6-4196-bb46-9e9c741b0819\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090588 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc627a54-0b63-4b4a-a9fc-db74921f2a63-metrics-certs\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: E0309 16:12:51.090632 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12da408b-c4d6-4196-bb46-9e9c741b0819-cert podName:12da408b-c4d6-4196-bb46-9e9c741b0819 nodeName:}" failed. No retries permitted until 2026-03-09 16:12:51.590612014 +0000 UTC m=+898.724294557 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12da408b-c4d6-4196-bb46-9e9c741b0819-cert") pod "frr-k8s-webhook-server-7f989f654f-pdcxr" (UID: "12da408b-c4d6-4196-bb46-9e9c741b0819") : secret "frr-k8s-webhook-server-cert" not found Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.090654 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d45gk\" (UniqueName: \"kubernetes.io/projected/2ae44691-87ac-4a4e-83c9-8f11dce70777-kube-api-access-d45gk\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091096 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-memberlist\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091116 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-metrics\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091121 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-sockets\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-reloader\") pod \"frr-k8s-czs2j\" 
(UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091252 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-conf\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091445 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-metrics\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091691 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-conf\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.091897 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dc627a54-0b63-4b4a-a9fc-db74921f2a63-frr-startup\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.100151 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc627a54-0b63-4b4a-a9fc-db74921f2a63-metrics-certs\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.106025 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvb7p\" (UniqueName: 
\"kubernetes.io/projected/12da408b-c4d6-4196-bb46-9e9c741b0819-kube-api-access-pvb7p\") pod \"frr-k8s-webhook-server-7f989f654f-pdcxr\" (UID: \"12da408b-c4d6-4196-bb46-9e9c741b0819\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.115966 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6fgl\" (UniqueName: \"kubernetes.io/projected/dc627a54-0b63-4b4a-a9fc-db74921f2a63-kube-api-access-z6fgl\") pod \"frr-k8s-czs2j\" (UID: \"dc627a54-0b63-4b4a-a9fc-db74921f2a63\") " pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.168604 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-czs2j" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.192154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d45gk\" (UniqueName: \"kubernetes.io/projected/2ae44691-87ac-4a4e-83c9-8f11dce70777-kube-api-access-d45gk\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.192197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-memberlist\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.192227 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-metrics-certs\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.192258 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cpswq\" (UniqueName: \"kubernetes.io/projected/68cf559a-a36c-4fae-9e8c-7130a85dd894-kube-api-access-cpswq\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.192302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ae44691-87ac-4a4e-83c9-8f11dce70777-metallb-excludel2\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.192324 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68cf559a-a36c-4fae-9e8c-7130a85dd894-cert\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.192340 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68cf559a-a36c-4fae-9e8c-7130a85dd894-metrics-certs\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: E0309 16:12:51.192672 4831 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 16:12:51 crc kubenswrapper[4831]: E0309 16:12:51.192725 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-memberlist podName:2ae44691-87ac-4a4e-83c9-8f11dce70777 nodeName:}" failed. No retries permitted until 2026-03-09 16:12:51.692711526 +0000 UTC m=+898.826393949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-memberlist") pod "speaker-jm6qg" (UID: "2ae44691-87ac-4a4e-83c9-8f11dce70777") : secret "metallb-memberlist" not found Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.194279 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ae44691-87ac-4a4e-83c9-8f11dce70777-metallb-excludel2\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.195709 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68cf559a-a36c-4fae-9e8c-7130a85dd894-metrics-certs\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.196070 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-metrics-certs\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.196149 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68cf559a-a36c-4fae-9e8c-7130a85dd894-cert\") pod \"controller-86ddb6bd46-lnrcv\" (UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.207659 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpswq\" (UniqueName: \"kubernetes.io/projected/68cf559a-a36c-4fae-9e8c-7130a85dd894-kube-api-access-cpswq\") pod \"controller-86ddb6bd46-lnrcv\" 
(UID: \"68cf559a-a36c-4fae-9e8c-7130a85dd894\") " pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.211048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d45gk\" (UniqueName: \"kubernetes.io/projected/2ae44691-87ac-4a4e-83c9-8f11dce70777-kube-api-access-d45gk\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.278129 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.450373 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-lnrcv"] Mar 09 16:12:51 crc kubenswrapper[4831]: W0309 16:12:51.455429 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68cf559a_a36c_4fae_9e8c_7130a85dd894.slice/crio-72b471a92e98be4e5ba4f37797e60b3840bbfc4c550cac885ba701c275503e4e WatchSource:0}: Error finding container 72b471a92e98be4e5ba4f37797e60b3840bbfc4c550cac885ba701c275503e4e: Status 404 returned error can't find the container with id 72b471a92e98be4e5ba4f37797e60b3840bbfc4c550cac885ba701c275503e4e Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.589432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerStarted","Data":"92262c3e2cdaffe9d8eaa1d0dd906272493d4ad7bf4b3b863e89d7acc8a28c21"} Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.591425 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lnrcv" event={"ID":"68cf559a-a36c-4fae-9e8c-7130a85dd894","Type":"ContainerStarted","Data":"4a7ca483d5143c1d16f900eb1d206d157828c03d72bf8d8cadd0ea4349846c3a"} Mar 09 
16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.591465 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lnrcv" event={"ID":"68cf559a-a36c-4fae-9e8c-7130a85dd894","Type":"ContainerStarted","Data":"72b471a92e98be4e5ba4f37797e60b3840bbfc4c550cac885ba701c275503e4e"} Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.597193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12da408b-c4d6-4196-bb46-9e9c741b0819-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pdcxr\" (UID: \"12da408b-c4d6-4196-bb46-9e9c741b0819\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.603462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12da408b-c4d6-4196-bb46-9e9c741b0819-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pdcxr\" (UID: \"12da408b-c4d6-4196-bb46-9e9c741b0819\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.704765 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-memberlist\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.707809 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ae44691-87ac-4a4e-83c9-8f11dce70777-memberlist\") pod \"speaker-jm6qg\" (UID: \"2ae44691-87ac-4a4e-83c9-8f11dce70777\") " pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.781921 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:51 crc kubenswrapper[4831]: I0309 16:12:51.861985 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jm6qg" Mar 09 16:12:51 crc kubenswrapper[4831]: W0309 16:12:51.903422 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ae44691_87ac_4a4e_83c9_8f11dce70777.slice/crio-ac5f0dc8aa123d694e28e51d3485f2c28046e222695fbbb8b9d22384ec8b41de WatchSource:0}: Error finding container ac5f0dc8aa123d694e28e51d3485f2c28046e222695fbbb8b9d22384ec8b41de: Status 404 returned error can't find the container with id ac5f0dc8aa123d694e28e51d3485f2c28046e222695fbbb8b9d22384ec8b41de Mar 09 16:12:52 crc kubenswrapper[4831]: I0309 16:12:52.011309 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr"] Mar 09 16:12:52 crc kubenswrapper[4831]: W0309 16:12:52.018120 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12da408b_c4d6_4196_bb46_9e9c741b0819.slice/crio-ce6b173941e41fc1d1ee15cfe5e72d312ebcaab2be5c5be4a1996d6e8d024041 WatchSource:0}: Error finding container ce6b173941e41fc1d1ee15cfe5e72d312ebcaab2be5c5be4a1996d6e8d024041: Status 404 returned error can't find the container with id ce6b173941e41fc1d1ee15cfe5e72d312ebcaab2be5c5be4a1996d6e8d024041 Mar 09 16:12:52 crc kubenswrapper[4831]: I0309 16:12:52.598039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" event={"ID":"12da408b-c4d6-4196-bb46-9e9c741b0819","Type":"ContainerStarted","Data":"ce6b173941e41fc1d1ee15cfe5e72d312ebcaab2be5c5be4a1996d6e8d024041"} Mar 09 16:12:52 crc kubenswrapper[4831]: I0309 16:12:52.600182 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-jm6qg" event={"ID":"2ae44691-87ac-4a4e-83c9-8f11dce70777","Type":"ContainerStarted","Data":"c9d9e89c6475ce531e73c23e1c60b7f151470af7ef582395bfe8849c5254b9db"} Mar 09 16:12:52 crc kubenswrapper[4831]: I0309 16:12:52.600210 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jm6qg" event={"ID":"2ae44691-87ac-4a4e-83c9-8f11dce70777","Type":"ContainerStarted","Data":"ac5f0dc8aa123d694e28e51d3485f2c28046e222695fbbb8b9d22384ec8b41de"} Mar 09 16:12:54 crc kubenswrapper[4831]: I0309 16:12:54.155598 4831 scope.go:117] "RemoveContainer" containerID="3b7dee95c3c4555ef23055b340f42c0b5e78fd5d43d205d30e80386f667f8162" Mar 09 16:12:56 crc kubenswrapper[4831]: I0309 16:12:56.643047 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jm6qg" event={"ID":"2ae44691-87ac-4a4e-83c9-8f11dce70777","Type":"ContainerStarted","Data":"1872865e68faddc9c494ad998c7f8c7a0bb3f0d802bb8617dbc1a74e1984f00f"} Mar 09 16:12:56 crc kubenswrapper[4831]: I0309 16:12:56.643370 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jm6qg" Mar 09 16:12:56 crc kubenswrapper[4831]: I0309 16:12:56.645391 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lnrcv" event={"ID":"68cf559a-a36c-4fae-9e8c-7130a85dd894","Type":"ContainerStarted","Data":"4307bdc9294518a70bb957268b8e6375c7ced9e6b854d77918c43ab28768755e"} Mar 09 16:12:56 crc kubenswrapper[4831]: I0309 16:12:56.645601 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:12:56 crc kubenswrapper[4831]: I0309 16:12:56.660640 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jm6qg" podStartSLOduration=3.182740485 podStartE2EDuration="6.660625161s" podCreationTimestamp="2026-03-09 16:12:50 +0000 UTC" firstStartedPulling="2026-03-09 16:12:52.157626246 +0000 UTC 
m=+899.291308669" lastFinishedPulling="2026-03-09 16:12:55.635510922 +0000 UTC m=+902.769193345" observedRunningTime="2026-03-09 16:12:56.658383367 +0000 UTC m=+903.792065800" watchObservedRunningTime="2026-03-09 16:12:56.660625161 +0000 UTC m=+903.794307584" Mar 09 16:12:56 crc kubenswrapper[4831]: I0309 16:12:56.679556 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-lnrcv" podStartSLOduration=2.520355794 podStartE2EDuration="6.679541615s" podCreationTimestamp="2026-03-09 16:12:50 +0000 UTC" firstStartedPulling="2026-03-09 16:12:51.57970351 +0000 UTC m=+898.713385933" lastFinishedPulling="2026-03-09 16:12:55.738889331 +0000 UTC m=+902.872571754" observedRunningTime="2026-03-09 16:12:56.675338744 +0000 UTC m=+903.809021177" watchObservedRunningTime="2026-03-09 16:12:56.679541615 +0000 UTC m=+903.813224038" Mar 09 16:12:59 crc kubenswrapper[4831]: I0309 16:12:59.666777 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" event={"ID":"12da408b-c4d6-4196-bb46-9e9c741b0819","Type":"ContainerStarted","Data":"290b08c6e5abd2bdc083cf647b6d6b98c0245673d410e96a4d9fad130220a3b8"} Mar 09 16:12:59 crc kubenswrapper[4831]: I0309 16:12:59.671221 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:12:59 crc kubenswrapper[4831]: I0309 16:12:59.669973 4831 generic.go:334] "Generic (PLEG): container finished" podID="dc627a54-0b63-4b4a-a9fc-db74921f2a63" containerID="89849f0834a8bc668a14e294d5f43afd3ff91339a296b3df61dad22026d7151b" exitCode=0 Mar 09 16:12:59 crc kubenswrapper[4831]: I0309 16:12:59.671268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerDied","Data":"89849f0834a8bc668a14e294d5f43afd3ff91339a296b3df61dad22026d7151b"} Mar 09 16:12:59 crc 
kubenswrapper[4831]: I0309 16:12:59.695730 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" podStartSLOduration=3.134967532 podStartE2EDuration="9.695706632s" podCreationTimestamp="2026-03-09 16:12:50 +0000 UTC" firstStartedPulling="2026-03-09 16:12:52.020172888 +0000 UTC m=+899.153855311" lastFinishedPulling="2026-03-09 16:12:58.580911988 +0000 UTC m=+905.714594411" observedRunningTime="2026-03-09 16:12:59.688604868 +0000 UTC m=+906.822287361" watchObservedRunningTime="2026-03-09 16:12:59.695706632 +0000 UTC m=+906.829389085" Mar 09 16:13:00 crc kubenswrapper[4831]: I0309 16:13:00.679434 4831 generic.go:334] "Generic (PLEG): container finished" podID="dc627a54-0b63-4b4a-a9fc-db74921f2a63" containerID="917b6e5cb7d0bacda32957b2b826338ac876bec7f8129aa53790d14cd02834bf" exitCode=0 Mar 09 16:13:00 crc kubenswrapper[4831]: I0309 16:13:00.679519 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerDied","Data":"917b6e5cb7d0bacda32957b2b826338ac876bec7f8129aa53790d14cd02834bf"} Mar 09 16:13:01 crc kubenswrapper[4831]: I0309 16:13:01.284521 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-lnrcv" Mar 09 16:13:01 crc kubenswrapper[4831]: I0309 16:13:01.689326 4831 generic.go:334] "Generic (PLEG): container finished" podID="dc627a54-0b63-4b4a-a9fc-db74921f2a63" containerID="6125e12038573c052756dde4e8eddd09db23e3e4034eaedce52c3460e66da107" exitCode=0 Mar 09 16:13:01 crc kubenswrapper[4831]: I0309 16:13:01.689375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerDied","Data":"6125e12038573c052756dde4e8eddd09db23e3e4034eaedce52c3460e66da107"} Mar 09 16:13:02 crc kubenswrapper[4831]: I0309 16:13:02.709934 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerStarted","Data":"ffe6d739d6d1c2fd5b8d0b80c7cb3bce2b8be0941cc6ae2ec6d0c63e58d57997"} Mar 09 16:13:02 crc kubenswrapper[4831]: I0309 16:13:02.710218 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerStarted","Data":"5176ab0374799ae2a9b9675f446eb8f45550af73b7bc2094367773a4b4b591da"} Mar 09 16:13:02 crc kubenswrapper[4831]: I0309 16:13:02.710228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerStarted","Data":"cb9d0bb75bfce8a93a724315d4ad3bcdfd43b266dcfa2132d19d1126252e75f7"} Mar 09 16:13:02 crc kubenswrapper[4831]: I0309 16:13:02.710236 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerStarted","Data":"67cf56a6e190fd658866b09efe8bb1cd73854a7d8178d80c2df84153950bc5f9"} Mar 09 16:13:02 crc kubenswrapper[4831]: I0309 16:13:02.710244 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerStarted","Data":"4fbb92d61f9f92c7548aae4cb0f21fad25cdd316111227d269de8d51ccec8fc1"} Mar 09 16:13:03 crc kubenswrapper[4831]: I0309 16:13:03.722447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-czs2j" event={"ID":"dc627a54-0b63-4b4a-a9fc-db74921f2a63","Type":"ContainerStarted","Data":"a08e7d7b28b076ba9345ff2550119bd57ebae441fde0e734c7b6ca30b9debdf6"} Mar 09 16:13:03 crc kubenswrapper[4831]: I0309 16:13:03.722947 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-czs2j" Mar 09 16:13:03 crc kubenswrapper[4831]: I0309 16:13:03.756175 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-czs2j" podStartSLOduration=6.513744554 podStartE2EDuration="13.756157799s" podCreationTimestamp="2026-03-09 16:12:50 +0000 UTC" firstStartedPulling="2026-03-09 16:12:51.333585262 +0000 UTC m=+898.467267675" lastFinishedPulling="2026-03-09 16:12:58.575998487 +0000 UTC m=+905.709680920" observedRunningTime="2026-03-09 16:13:03.750765394 +0000 UTC m=+910.884447837" watchObservedRunningTime="2026-03-09 16:13:03.756157799 +0000 UTC m=+910.889840232" Mar 09 16:13:06 crc kubenswrapper[4831]: I0309 16:13:06.169843 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-czs2j" Mar 09 16:13:06 crc kubenswrapper[4831]: I0309 16:13:06.241256 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-czs2j" Mar 09 16:13:11 crc kubenswrapper[4831]: I0309 16:13:11.176125 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-czs2j" Mar 09 16:13:11 crc kubenswrapper[4831]: I0309 16:13:11.792285 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pdcxr" Mar 09 16:13:11 crc kubenswrapper[4831]: I0309 16:13:11.875225 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jm6qg" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.557783 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-4cj5w"] Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.559283 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-4cj5w" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.562274 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-n7vfv" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.562758 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.563268 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.572607 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-4cj5w"] Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.684581 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6569\" (UniqueName: \"kubernetes.io/projected/cfc43fc6-0adb-4c1a-af8f-cadddfad8c05-kube-api-access-q6569\") pod \"mariadb-operator-index-4cj5w\" (UID: \"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05\") " pod="openstack-operators/mariadb-operator-index-4cj5w" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.785891 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6569\" (UniqueName: \"kubernetes.io/projected/cfc43fc6-0adb-4c1a-af8f-cadddfad8c05-kube-api-access-q6569\") pod \"mariadb-operator-index-4cj5w\" (UID: \"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05\") " pod="openstack-operators/mariadb-operator-index-4cj5w" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.808254 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6569\" (UniqueName: \"kubernetes.io/projected/cfc43fc6-0adb-4c1a-af8f-cadddfad8c05-kube-api-access-q6569\") pod \"mariadb-operator-index-4cj5w\" (UID: \"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05\") " 
pod="openstack-operators/mariadb-operator-index-4cj5w" Mar 09 16:13:17 crc kubenswrapper[4831]: I0309 16:13:17.944520 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4cj5w" Mar 09 16:13:18 crc kubenswrapper[4831]: I0309 16:13:18.152986 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-4cj5w"] Mar 09 16:13:18 crc kubenswrapper[4831]: W0309 16:13:18.157968 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfc43fc6_0adb_4c1a_af8f_cadddfad8c05.slice/crio-1fd66275be720681ee45a71b45f29d138f4ddc07ee1f0cf216d824dc8d3e6565 WatchSource:0}: Error finding container 1fd66275be720681ee45a71b45f29d138f4ddc07ee1f0cf216d824dc8d3e6565: Status 404 returned error can't find the container with id 1fd66275be720681ee45a71b45f29d138f4ddc07ee1f0cf216d824dc8d3e6565 Mar 09 16:13:18 crc kubenswrapper[4831]: I0309 16:13:18.160155 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:13:18 crc kubenswrapper[4831]: I0309 16:13:18.833009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4cj5w" event={"ID":"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05","Type":"ContainerStarted","Data":"1fd66275be720681ee45a71b45f29d138f4ddc07ee1f0cf216d824dc8d3e6565"} Mar 09 16:13:19 crc kubenswrapper[4831]: I0309 16:13:19.839892 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4cj5w" event={"ID":"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05","Type":"ContainerStarted","Data":"a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e"} Mar 09 16:13:20 crc kubenswrapper[4831]: I0309 16:13:20.937012 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-4cj5w" podStartSLOduration=3.059156303 
podStartE2EDuration="3.936988083s" podCreationTimestamp="2026-03-09 16:13:17 +0000 UTC" firstStartedPulling="2026-03-09 16:13:18.159899651 +0000 UTC m=+925.293582074" lastFinishedPulling="2026-03-09 16:13:19.037731401 +0000 UTC m=+926.171413854" observedRunningTime="2026-03-09 16:13:19.857938375 +0000 UTC m=+926.991620798" watchObservedRunningTime="2026-03-09 16:13:20.936988083 +0000 UTC m=+928.070670536" Mar 09 16:13:20 crc kubenswrapper[4831]: I0309 16:13:20.941616 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-4cj5w"] Mar 09 16:13:21 crc kubenswrapper[4831]: I0309 16:13:21.543337 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-ktplw"] Mar 09 16:13:21 crc kubenswrapper[4831]: I0309 16:13:21.543995 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:21 crc kubenswrapper[4831]: I0309 16:13:21.549748 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-ktplw"] Mar 09 16:13:21 crc kubenswrapper[4831]: I0309 16:13:21.646567 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td9rq\" (UniqueName: \"kubernetes.io/projected/04284f29-23b1-41a2-b851-6391b29c4cb4-kube-api-access-td9rq\") pod \"mariadb-operator-index-ktplw\" (UID: \"04284f29-23b1-41a2-b851-6391b29c4cb4\") " pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:21 crc kubenswrapper[4831]: I0309 16:13:21.748163 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td9rq\" (UniqueName: \"kubernetes.io/projected/04284f29-23b1-41a2-b851-6391b29c4cb4-kube-api-access-td9rq\") pod \"mariadb-operator-index-ktplw\" (UID: \"04284f29-23b1-41a2-b851-6391b29c4cb4\") " pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:21 crc 
kubenswrapper[4831]: I0309 16:13:21.781704 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td9rq\" (UniqueName: \"kubernetes.io/projected/04284f29-23b1-41a2-b851-6391b29c4cb4-kube-api-access-td9rq\") pod \"mariadb-operator-index-ktplw\" (UID: \"04284f29-23b1-41a2-b851-6391b29c4cb4\") " pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:21 crc kubenswrapper[4831]: I0309 16:13:21.860828 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-4cj5w" podUID="cfc43fc6-0adb-4c1a-af8f-cadddfad8c05" containerName="registry-server" containerID="cri-o://a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e" gracePeriod=2 Mar 09 16:13:21 crc kubenswrapper[4831]: I0309 16:13:21.864760 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.260835 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-4cj5w" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.327844 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-ktplw"] Mar 09 16:13:22 crc kubenswrapper[4831]: W0309 16:13:22.331201 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04284f29_23b1_41a2_b851_6391b29c4cb4.slice/crio-ed2c37f606e60bd582769a59dd64bdc36f923014975a0c4e068ac22041ed0109 WatchSource:0}: Error finding container ed2c37f606e60bd582769a59dd64bdc36f923014975a0c4e068ac22041ed0109: Status 404 returned error can't find the container with id ed2c37f606e60bd582769a59dd64bdc36f923014975a0c4e068ac22041ed0109 Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.364719 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6569\" (UniqueName: \"kubernetes.io/projected/cfc43fc6-0adb-4c1a-af8f-cadddfad8c05-kube-api-access-q6569\") pod \"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05\" (UID: \"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05\") " Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.370142 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc43fc6-0adb-4c1a-af8f-cadddfad8c05-kube-api-access-q6569" (OuterVolumeSpecName: "kube-api-access-q6569") pod "cfc43fc6-0adb-4c1a-af8f-cadddfad8c05" (UID: "cfc43fc6-0adb-4c1a-af8f-cadddfad8c05"). InnerVolumeSpecName "kube-api-access-q6569". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.466681 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6569\" (UniqueName: \"kubernetes.io/projected/cfc43fc6-0adb-4c1a-af8f-cadddfad8c05-kube-api-access-q6569\") on node \"crc\" DevicePath \"\"" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.870472 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-ktplw" event={"ID":"04284f29-23b1-41a2-b851-6391b29c4cb4","Type":"ContainerStarted","Data":"ed2c37f606e60bd582769a59dd64bdc36f923014975a0c4e068ac22041ed0109"} Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.872980 4831 generic.go:334] "Generic (PLEG): container finished" podID="cfc43fc6-0adb-4c1a-af8f-cadddfad8c05" containerID="a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e" exitCode=0 Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.873045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4cj5w" event={"ID":"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05","Type":"ContainerDied","Data":"a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e"} Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.873185 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4cj5w" event={"ID":"cfc43fc6-0adb-4c1a-af8f-cadddfad8c05","Type":"ContainerDied","Data":"1fd66275be720681ee45a71b45f29d138f4ddc07ee1f0cf216d824dc8d3e6565"} Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.873062 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-4cj5w" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.873211 4831 scope.go:117] "RemoveContainer" containerID="a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.906215 4831 scope.go:117] "RemoveContainer" containerID="a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e" Mar 09 16:13:22 crc kubenswrapper[4831]: E0309 16:13:22.906887 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e\": container with ID starting with a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e not found: ID does not exist" containerID="a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.907010 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e"} err="failed to get container status \"a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e\": rpc error: code = NotFound desc = could not find container \"a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e\": container with ID starting with a20f9afe6c9399760e8bc79682a2033e36446e4c6d3ed4b844ed2cdcd718e47e not found: ID does not exist" Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.919176 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-4cj5w"] Mar 09 16:13:22 crc kubenswrapper[4831]: I0309 16:13:22.928029 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-4cj5w"] Mar 09 16:13:23 crc kubenswrapper[4831]: I0309 16:13:23.627423 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cfc43fc6-0adb-4c1a-af8f-cadddfad8c05" path="/var/lib/kubelet/pods/cfc43fc6-0adb-4c1a-af8f-cadddfad8c05/volumes" Mar 09 16:13:23 crc kubenswrapper[4831]: I0309 16:13:23.889814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-ktplw" event={"ID":"04284f29-23b1-41a2-b851-6391b29c4cb4","Type":"ContainerStarted","Data":"34cf9cc3620918d082e2b0f753379072725bbd01efa7b32d77d80cf55e5fc2e8"} Mar 09 16:13:23 crc kubenswrapper[4831]: I0309 16:13:23.910206 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-ktplw" podStartSLOduration=2.357386941 podStartE2EDuration="2.910173456s" podCreationTimestamp="2026-03-09 16:13:21 +0000 UTC" firstStartedPulling="2026-03-09 16:13:22.336281457 +0000 UTC m=+929.469963910" lastFinishedPulling="2026-03-09 16:13:22.889067992 +0000 UTC m=+930.022750425" observedRunningTime="2026-03-09 16:13:23.907324514 +0000 UTC m=+931.041007027" watchObservedRunningTime="2026-03-09 16:13:23.910173456 +0000 UTC m=+931.043855959" Mar 09 16:13:31 crc kubenswrapper[4831]: I0309 16:13:31.865760 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:31 crc kubenswrapper[4831]: I0309 16:13:31.866132 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:31 crc kubenswrapper[4831]: I0309 16:13:31.910675 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:31 crc kubenswrapper[4831]: I0309 16:13:31.991098 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-ktplw" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.533879 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58"] Mar 09 16:13:38 crc kubenswrapper[4831]: E0309 16:13:38.534860 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc43fc6-0adb-4c1a-af8f-cadddfad8c05" containerName="registry-server" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.534882 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc43fc6-0adb-4c1a-af8f-cadddfad8c05" containerName="registry-server" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.535089 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc43fc6-0adb-4c1a-af8f-cadddfad8c05" containerName="registry-server" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.536489 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.543896 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djwsq" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.547262 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58"] Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.615872 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpw4r\" (UniqueName: \"kubernetes.io/projected/57a06f9e-e898-4c05-a894-76fdcac7f633-kube-api-access-cpw4r\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.615921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.616007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.717713 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpw4r\" (UniqueName: \"kubernetes.io/projected/57a06f9e-e898-4c05-a894-76fdcac7f633-kube-api-access-cpw4r\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.717825 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.717901 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" 
(UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.718669 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.718775 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.739631 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpw4r\" (UniqueName: \"kubernetes.io/projected/57a06f9e-e898-4c05-a894-76fdcac7f633-kube-api-access-cpw4r\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:38 crc kubenswrapper[4831]: I0309 16:13:38.867477 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:39 crc kubenswrapper[4831]: I0309 16:13:39.158065 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58"] Mar 09 16:13:40 crc kubenswrapper[4831]: I0309 16:13:40.019510 4831 generic.go:334] "Generic (PLEG): container finished" podID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerID="8fb38f6c13df979e767edde3d571527f2a0a2fd437af7528a50172f6966f7604" exitCode=0 Mar 09 16:13:40 crc kubenswrapper[4831]: I0309 16:13:40.019610 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" event={"ID":"57a06f9e-e898-4c05-a894-76fdcac7f633","Type":"ContainerDied","Data":"8fb38f6c13df979e767edde3d571527f2a0a2fd437af7528a50172f6966f7604"} Mar 09 16:13:40 crc kubenswrapper[4831]: I0309 16:13:40.019840 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" event={"ID":"57a06f9e-e898-4c05-a894-76fdcac7f633","Type":"ContainerStarted","Data":"cd04fb6da19269bfb9d9ca16e2491065f5ca141bc805adcf7a165aa83dd80549"} Mar 09 16:13:42 crc kubenswrapper[4831]: I0309 16:13:42.044315 4831 generic.go:334] "Generic (PLEG): container finished" podID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerID="dfbb33fc8ab093f22020ced79e97b330429013989d54a89b91fb11763a8b9a36" exitCode=0 Mar 09 16:13:42 crc kubenswrapper[4831]: I0309 16:13:42.044374 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" event={"ID":"57a06f9e-e898-4c05-a894-76fdcac7f633","Type":"ContainerDied","Data":"dfbb33fc8ab093f22020ced79e97b330429013989d54a89b91fb11763a8b9a36"} Mar 09 16:13:43 crc kubenswrapper[4831]: I0309 16:13:43.054736 4831 generic.go:334] 
"Generic (PLEG): container finished" podID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerID="78ad80b9b8cdfafaf430fb9bfdbdd34a8e552a2e17fae0a477b6d9d92672e081" exitCode=0 Mar 09 16:13:43 crc kubenswrapper[4831]: I0309 16:13:43.054797 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" event={"ID":"57a06f9e-e898-4c05-a894-76fdcac7f633","Type":"ContainerDied","Data":"78ad80b9b8cdfafaf430fb9bfdbdd34a8e552a2e17fae0a477b6d9d92672e081"} Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.361029 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.503222 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-util\") pod \"57a06f9e-e898-4c05-a894-76fdcac7f633\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.503474 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-bundle\") pod \"57a06f9e-e898-4c05-a894-76fdcac7f633\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.503519 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpw4r\" (UniqueName: \"kubernetes.io/projected/57a06f9e-e898-4c05-a894-76fdcac7f633-kube-api-access-cpw4r\") pod \"57a06f9e-e898-4c05-a894-76fdcac7f633\" (UID: \"57a06f9e-e898-4c05-a894-76fdcac7f633\") " Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.506104 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-bundle" (OuterVolumeSpecName: "bundle") pod "57a06f9e-e898-4c05-a894-76fdcac7f633" (UID: "57a06f9e-e898-4c05-a894-76fdcac7f633"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.512121 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a06f9e-e898-4c05-a894-76fdcac7f633-kube-api-access-cpw4r" (OuterVolumeSpecName: "kube-api-access-cpw4r") pod "57a06f9e-e898-4c05-a894-76fdcac7f633" (UID: "57a06f9e-e898-4c05-a894-76fdcac7f633"). InnerVolumeSpecName "kube-api-access-cpw4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.532928 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-util" (OuterVolumeSpecName: "util") pod "57a06f9e-e898-4c05-a894-76fdcac7f633" (UID: "57a06f9e-e898-4c05-a894-76fdcac7f633"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.605064 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpw4r\" (UniqueName: \"kubernetes.io/projected/57a06f9e-e898-4c05-a894-76fdcac7f633-kube-api-access-cpw4r\") on node \"crc\" DevicePath \"\"" Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.605114 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-util\") on node \"crc\" DevicePath \"\"" Mar 09 16:13:44 crc kubenswrapper[4831]: I0309 16:13:44.605132 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57a06f9e-e898-4c05-a894-76fdcac7f633-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:13:45 crc kubenswrapper[4831]: I0309 16:13:45.073753 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" Mar 09 16:13:45 crc kubenswrapper[4831]: I0309 16:13:45.073693 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58" event={"ID":"57a06f9e-e898-4c05-a894-76fdcac7f633","Type":"ContainerDied","Data":"cd04fb6da19269bfb9d9ca16e2491065f5ca141bc805adcf7a165aa83dd80549"} Mar 09 16:13:45 crc kubenswrapper[4831]: I0309 16:13:45.073937 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd04fb6da19269bfb9d9ca16e2491065f5ca141bc805adcf7a165aa83dd80549" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.675277 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69"] Mar 09 16:13:51 crc kubenswrapper[4831]: E0309 16:13:51.675802 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerName="util" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.675816 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerName="util" Mar 09 16:13:51 crc kubenswrapper[4831]: E0309 16:13:51.675841 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerName="pull" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.675851 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerName="pull" Mar 09 16:13:51 crc kubenswrapper[4831]: E0309 16:13:51.675862 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerName="extract" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.675870 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerName="extract" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.676010 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a06f9e-e898-4c05-a894-76fdcac7f633" containerName="extract" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.676474 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.678389 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.678808 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.678826 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vl2xv" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.688426 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69"] Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.800825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg4tx\" (UniqueName: \"kubernetes.io/projected/04aafaf0-8914-454d-8c30-0d9655615704-kube-api-access-hg4tx\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.801124 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04aafaf0-8914-454d-8c30-0d9655615704-webhook-cert\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.801162 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04aafaf0-8914-454d-8c30-0d9655615704-apiservice-cert\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.901959 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4tx\" (UniqueName: \"kubernetes.io/projected/04aafaf0-8914-454d-8c30-0d9655615704-kube-api-access-hg4tx\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.902036 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04aafaf0-8914-454d-8c30-0d9655615704-webhook-cert\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.902072 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04aafaf0-8914-454d-8c30-0d9655615704-apiservice-cert\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.909311 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04aafaf0-8914-454d-8c30-0d9655615704-webhook-cert\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") 
" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.916381 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04aafaf0-8914-454d-8c30-0d9655615704-apiservice-cert\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.918159 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4tx\" (UniqueName: \"kubernetes.io/projected/04aafaf0-8914-454d-8c30-0d9655615704-kube-api-access-hg4tx\") pod \"mariadb-operator-controller-manager-84c5d77ccf-pvs69\" (UID: \"04aafaf0-8914-454d-8c30-0d9655615704\") " pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:51 crc kubenswrapper[4831]: I0309 16:13:51.993586 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:52 crc kubenswrapper[4831]: I0309 16:13:52.422844 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69"] Mar 09 16:13:53 crc kubenswrapper[4831]: I0309 16:13:53.125868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" event={"ID":"04aafaf0-8914-454d-8c30-0d9655615704","Type":"ContainerStarted","Data":"eb09797b44995f79093bfca4ab3d5c2d633ce5b08a4921d0b41ce4b0d77c7e9a"} Mar 09 16:13:56 crc kubenswrapper[4831]: I0309 16:13:56.156120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" event={"ID":"04aafaf0-8914-454d-8c30-0d9655615704","Type":"ContainerStarted","Data":"c6128c3599bc1beabb952efcc29e08d22f3441569bb57a2dac6e988788c3553d"} Mar 09 16:13:56 crc kubenswrapper[4831]: I0309 16:13:56.156960 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:13:56 crc kubenswrapper[4831]: I0309 16:13:56.176188 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" podStartSLOduration=2.014829602 podStartE2EDuration="5.176172389s" podCreationTimestamp="2026-03-09 16:13:51 +0000 UTC" firstStartedPulling="2026-03-09 16:13:52.436134413 +0000 UTC m=+959.569816836" lastFinishedPulling="2026-03-09 16:13:55.5974772 +0000 UTC m=+962.731159623" observedRunningTime="2026-03-09 16:13:56.17414756 +0000 UTC m=+963.307829983" watchObservedRunningTime="2026-03-09 16:13:56.176172389 +0000 UTC m=+963.309854812" Mar 09 16:13:56 crc kubenswrapper[4831]: I0309 16:13:56.908900 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qrphf"] Mar 09 16:13:56 crc kubenswrapper[4831]: I0309 16:13:56.910549 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:56 crc kubenswrapper[4831]: I0309 16:13:56.919908 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrphf"] Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.072740 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-utilities\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.072793 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-catalog-content\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.072839 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwss\" (UniqueName: \"kubernetes.io/projected/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-kube-api-access-qbwss\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.173783 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-utilities\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " 
pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.173832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-catalog-content\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.173882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwss\" (UniqueName: \"kubernetes.io/projected/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-kube-api-access-qbwss\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.174440 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-catalog-content\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.174568 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-utilities\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.205699 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwss\" (UniqueName: \"kubernetes.io/projected/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-kube-api-access-qbwss\") pod \"redhat-marketplace-qrphf\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " 
pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.225755 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:13:57 crc kubenswrapper[4831]: I0309 16:13:57.672191 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrphf"] Mar 09 16:13:57 crc kubenswrapper[4831]: W0309 16:13:57.681292 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod659c4fc5_edfd_4a01_a53b_1d3051a4d20f.slice/crio-ebc5084fa0927b36a107a38e503edcc79e4b93ef42a8ee01f75b4a89f038c87b WatchSource:0}: Error finding container ebc5084fa0927b36a107a38e503edcc79e4b93ef42a8ee01f75b4a89f038c87b: Status 404 returned error can't find the container with id ebc5084fa0927b36a107a38e503edcc79e4b93ef42a8ee01f75b4a89f038c87b Mar 09 16:13:58 crc kubenswrapper[4831]: I0309 16:13:58.177291 4831 generic.go:334] "Generic (PLEG): container finished" podID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerID="dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012" exitCode=0 Mar 09 16:13:58 crc kubenswrapper[4831]: I0309 16:13:58.177352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrphf" event={"ID":"659c4fc5-edfd-4a01-a53b-1d3051a4d20f","Type":"ContainerDied","Data":"dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012"} Mar 09 16:13:58 crc kubenswrapper[4831]: I0309 16:13:58.177434 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrphf" event={"ID":"659c4fc5-edfd-4a01-a53b-1d3051a4d20f","Type":"ContainerStarted","Data":"ebc5084fa0927b36a107a38e503edcc79e4b93ef42a8ee01f75b4a89f038c87b"} Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.129263 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29551214-zmwwz"] Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.130708 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551214-zmwwz" Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.132590 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.132919 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.133120 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.146000 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551214-zmwwz"] Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.189373 4831 generic.go:334] "Generic (PLEG): container finished" podID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerID="94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765" exitCode=0 Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.189432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrphf" event={"ID":"659c4fc5-edfd-4a01-a53b-1d3051a4d20f","Type":"ContainerDied","Data":"94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765"} Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.212836 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfkg\" (UniqueName: \"kubernetes.io/projected/7153cf3a-f458-4a17-b759-85d90d63d60a-kube-api-access-jqfkg\") pod \"auto-csr-approver-29551214-zmwwz\" (UID: \"7153cf3a-f458-4a17-b759-85d90d63d60a\") " pod="openshift-infra/auto-csr-approver-29551214-zmwwz" Mar 09 16:14:00 crc 
kubenswrapper[4831]: I0309 16:14:00.314143 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqfkg\" (UniqueName: \"kubernetes.io/projected/7153cf3a-f458-4a17-b759-85d90d63d60a-kube-api-access-jqfkg\") pod \"auto-csr-approver-29551214-zmwwz\" (UID: \"7153cf3a-f458-4a17-b759-85d90d63d60a\") " pod="openshift-infra/auto-csr-approver-29551214-zmwwz" Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.337629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqfkg\" (UniqueName: \"kubernetes.io/projected/7153cf3a-f458-4a17-b759-85d90d63d60a-kube-api-access-jqfkg\") pod \"auto-csr-approver-29551214-zmwwz\" (UID: \"7153cf3a-f458-4a17-b759-85d90d63d60a\") " pod="openshift-infra/auto-csr-approver-29551214-zmwwz" Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.446210 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551214-zmwwz" Mar 09 16:14:00 crc kubenswrapper[4831]: I0309 16:14:00.710244 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551214-zmwwz"] Mar 09 16:14:01 crc kubenswrapper[4831]: I0309 16:14:01.199288 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrphf" event={"ID":"659c4fc5-edfd-4a01-a53b-1d3051a4d20f","Type":"ContainerStarted","Data":"c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082"} Mar 09 16:14:01 crc kubenswrapper[4831]: I0309 16:14:01.200373 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551214-zmwwz" event={"ID":"7153cf3a-f458-4a17-b759-85d90d63d60a","Type":"ContainerStarted","Data":"c26f8142ecdf0c9faf5f914b0d87bf0f77959420eb4fe7fe9e0a3d4d68e01f06"} Mar 09 16:14:01 crc kubenswrapper[4831]: I0309 16:14:01.223100 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-qrphf" podStartSLOduration=2.705123343 podStartE2EDuration="5.223069363s" podCreationTimestamp="2026-03-09 16:13:56 +0000 UTC" firstStartedPulling="2026-03-09 16:13:58.180064075 +0000 UTC m=+965.313746518" lastFinishedPulling="2026-03-09 16:14:00.698010115 +0000 UTC m=+967.831692538" observedRunningTime="2026-03-09 16:14:01.217284037 +0000 UTC m=+968.350966470" watchObservedRunningTime="2026-03-09 16:14:01.223069363 +0000 UTC m=+968.356751826" Mar 09 16:14:01 crc kubenswrapper[4831]: I0309 16:14:01.999152 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-84c5d77ccf-pvs69" Mar 09 16:14:02 crc kubenswrapper[4831]: I0309 16:14:02.209140 4831 generic.go:334] "Generic (PLEG): container finished" podID="7153cf3a-f458-4a17-b759-85d90d63d60a" containerID="626893ce754906cf904b0a44f23eda0f3441541ad49b103aad6af0afe422a259" exitCode=0 Mar 09 16:14:02 crc kubenswrapper[4831]: I0309 16:14:02.209316 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551214-zmwwz" event={"ID":"7153cf3a-f458-4a17-b759-85d90d63d60a","Type":"ContainerDied","Data":"626893ce754906cf904b0a44f23eda0f3441541ad49b103aad6af0afe422a259"} Mar 09 16:14:03 crc kubenswrapper[4831]: I0309 16:14:03.019389 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:14:03 crc kubenswrapper[4831]: I0309 16:14:03.019483 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 09 16:14:03 crc kubenswrapper[4831]: I0309 16:14:03.484967 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551214-zmwwz" Mar 09 16:14:03 crc kubenswrapper[4831]: I0309 16:14:03.551591 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqfkg\" (UniqueName: \"kubernetes.io/projected/7153cf3a-f458-4a17-b759-85d90d63d60a-kube-api-access-jqfkg\") pod \"7153cf3a-f458-4a17-b759-85d90d63d60a\" (UID: \"7153cf3a-f458-4a17-b759-85d90d63d60a\") " Mar 09 16:14:03 crc kubenswrapper[4831]: I0309 16:14:03.558652 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7153cf3a-f458-4a17-b759-85d90d63d60a-kube-api-access-jqfkg" (OuterVolumeSpecName: "kube-api-access-jqfkg") pod "7153cf3a-f458-4a17-b759-85d90d63d60a" (UID: "7153cf3a-f458-4a17-b759-85d90d63d60a"). InnerVolumeSpecName "kube-api-access-jqfkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:14:03 crc kubenswrapper[4831]: I0309 16:14:03.653473 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqfkg\" (UniqueName: \"kubernetes.io/projected/7153cf3a-f458-4a17-b759-85d90d63d60a-kube-api-access-jqfkg\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:04 crc kubenswrapper[4831]: I0309 16:14:04.222237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551214-zmwwz" event={"ID":"7153cf3a-f458-4a17-b759-85d90d63d60a","Type":"ContainerDied","Data":"c26f8142ecdf0c9faf5f914b0d87bf0f77959420eb4fe7fe9e0a3d4d68e01f06"} Mar 09 16:14:04 crc kubenswrapper[4831]: I0309 16:14:04.222276 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26f8142ecdf0c9faf5f914b0d87bf0f77959420eb4fe7fe9e0a3d4d68e01f06" Mar 09 16:14:04 crc kubenswrapper[4831]: I0309 16:14:04.222303 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551214-zmwwz" Mar 09 16:14:04 crc kubenswrapper[4831]: I0309 16:14:04.532691 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551208-pchsj"] Mar 09 16:14:04 crc kubenswrapper[4831]: I0309 16:14:04.536289 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551208-pchsj"] Mar 09 16:14:05 crc kubenswrapper[4831]: I0309 16:14:05.626173 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b3f171-585c-4127-970e-da778a2f1d83" path="/var/lib/kubelet/pods/e3b3f171-585c-4127-970e-da778a2f1d83/volumes" Mar 09 16:14:07 crc kubenswrapper[4831]: I0309 16:14:07.226101 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:14:07 crc kubenswrapper[4831]: I0309 16:14:07.226500 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:14:07 crc kubenswrapper[4831]: I0309 16:14:07.281018 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.299022 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.706578 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-snjjr"] Mar 09 16:14:08 crc kubenswrapper[4831]: E0309 16:14:08.706842 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7153cf3a-f458-4a17-b759-85d90d63d60a" containerName="oc" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.706860 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7153cf3a-f458-4a17-b759-85d90d63d60a" containerName="oc" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.706990 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7153cf3a-f458-4a17-b759-85d90d63d60a" containerName="oc" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.707477 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-snjjr" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.709701 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-qnxh5" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.722093 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-snjjr"] Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.822928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvpt2\" (UniqueName: \"kubernetes.io/projected/42fcc95e-b9e5-4878-85a9-9af4f2f157df-kube-api-access-rvpt2\") pod \"infra-operator-index-snjjr\" (UID: \"42fcc95e-b9e5-4878-85a9-9af4f2f157df\") " pod="openstack-operators/infra-operator-index-snjjr" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.924161 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvpt2\" (UniqueName: \"kubernetes.io/projected/42fcc95e-b9e5-4878-85a9-9af4f2f157df-kube-api-access-rvpt2\") pod \"infra-operator-index-snjjr\" (UID: \"42fcc95e-b9e5-4878-85a9-9af4f2f157df\") " pod="openstack-operators/infra-operator-index-snjjr" Mar 09 16:14:08 crc kubenswrapper[4831]: I0309 16:14:08.943952 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvpt2\" (UniqueName: \"kubernetes.io/projected/42fcc95e-b9e5-4878-85a9-9af4f2f157df-kube-api-access-rvpt2\") pod \"infra-operator-index-snjjr\" (UID: \"42fcc95e-b9e5-4878-85a9-9af4f2f157df\") " pod="openstack-operators/infra-operator-index-snjjr" Mar 09 16:14:09 crc kubenswrapper[4831]: I0309 16:14:09.038629 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-snjjr" Mar 09 16:14:09 crc kubenswrapper[4831]: I0309 16:14:09.458242 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-snjjr"] Mar 09 16:14:10 crc kubenswrapper[4831]: I0309 16:14:10.260545 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-snjjr" event={"ID":"42fcc95e-b9e5-4878-85a9-9af4f2f157df","Type":"ContainerStarted","Data":"5f7806e26b6a6d40f50cb09538d05e1fe612a4a94a9081c84e09bc4d5b24b8e7"} Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.270483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-snjjr" event={"ID":"42fcc95e-b9e5-4878-85a9-9af4f2f157df","Type":"ContainerStarted","Data":"2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca"} Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.291992 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-snjjr" podStartSLOduration=2.2924879369999998 podStartE2EDuration="3.291973778s" podCreationTimestamp="2026-03-09 16:14:08 +0000 UTC" firstStartedPulling="2026-03-09 16:14:09.484575304 +0000 UTC m=+976.618257727" lastFinishedPulling="2026-03-09 16:14:10.484061135 +0000 UTC m=+977.617743568" observedRunningTime="2026-03-09 16:14:11.288946331 +0000 UTC m=+978.422628774" watchObservedRunningTime="2026-03-09 16:14:11.291973778 +0000 UTC m=+978.425656211" Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.483189 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrphf"] Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.483416 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qrphf" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="registry-server" 
containerID="cri-o://c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082" gracePeriod=2 Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.852970 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.987000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-catalog-content\") pod \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.987138 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbwss\" (UniqueName: \"kubernetes.io/projected/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-kube-api-access-qbwss\") pod \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.987173 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-utilities\") pod \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\" (UID: \"659c4fc5-edfd-4a01-a53b-1d3051a4d20f\") " Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.988854 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-utilities" (OuterVolumeSpecName: "utilities") pod "659c4fc5-edfd-4a01-a53b-1d3051a4d20f" (UID: "659c4fc5-edfd-4a01-a53b-1d3051a4d20f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.989646 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:11 crc kubenswrapper[4831]: I0309 16:14:11.992456 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-kube-api-access-qbwss" (OuterVolumeSpecName: "kube-api-access-qbwss") pod "659c4fc5-edfd-4a01-a53b-1d3051a4d20f" (UID: "659c4fc5-edfd-4a01-a53b-1d3051a4d20f"). InnerVolumeSpecName "kube-api-access-qbwss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.011793 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "659c4fc5-edfd-4a01-a53b-1d3051a4d20f" (UID: "659c4fc5-edfd-4a01-a53b-1d3051a4d20f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.091016 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.091050 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbwss\" (UniqueName: \"kubernetes.io/projected/659c4fc5-edfd-4a01-a53b-1d3051a4d20f-kube-api-access-qbwss\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.280516 4831 generic.go:334] "Generic (PLEG): container finished" podID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerID="c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082" exitCode=0 Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.280580 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrphf" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.281509 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrphf" event={"ID":"659c4fc5-edfd-4a01-a53b-1d3051a4d20f","Type":"ContainerDied","Data":"c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082"} Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.281578 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrphf" event={"ID":"659c4fc5-edfd-4a01-a53b-1d3051a4d20f","Type":"ContainerDied","Data":"ebc5084fa0927b36a107a38e503edcc79e4b93ef42a8ee01f75b4a89f038c87b"} Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.281606 4831 scope.go:117] "RemoveContainer" containerID="c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.302765 4831 scope.go:117] "RemoveContainer" 
containerID="94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.326276 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrphf"] Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.329220 4831 scope.go:117] "RemoveContainer" containerID="dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.332178 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrphf"] Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.361986 4831 scope.go:117] "RemoveContainer" containerID="c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082" Mar 09 16:14:12 crc kubenswrapper[4831]: E0309 16:14:12.362387 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082\": container with ID starting with c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082 not found: ID does not exist" containerID="c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.362447 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082"} err="failed to get container status \"c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082\": rpc error: code = NotFound desc = could not find container \"c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082\": container with ID starting with c5f15cef16dbb7b4327dff6f9359824459348ef0b75d6eebc22e178c0b096082 not found: ID does not exist" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.362472 4831 scope.go:117] "RemoveContainer" 
containerID="94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765" Mar 09 16:14:12 crc kubenswrapper[4831]: E0309 16:14:12.362776 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765\": container with ID starting with 94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765 not found: ID does not exist" containerID="94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.362799 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765"} err="failed to get container status \"94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765\": rpc error: code = NotFound desc = could not find container \"94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765\": container with ID starting with 94552138c5ffcee0c11d443320975a9b6961120a9500dd9b06161c6da7155765 not found: ID does not exist" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.362821 4831 scope.go:117] "RemoveContainer" containerID="dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012" Mar 09 16:14:12 crc kubenswrapper[4831]: E0309 16:14:12.363118 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012\": container with ID starting with dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012 not found: ID does not exist" containerID="dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012" Mar 09 16:14:12 crc kubenswrapper[4831]: I0309 16:14:12.363140 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012"} err="failed to get container status \"dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012\": rpc error: code = NotFound desc = could not find container \"dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012\": container with ID starting with dfff2f5188be52d7c0bcfc954423cc634d9dc8b4182caf527b8b918bce314012 not found: ID does not exist" Mar 09 16:14:13 crc kubenswrapper[4831]: I0309 16:14:13.628940 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" path="/var/lib/kubelet/pods/659c4fc5-edfd-4a01-a53b-1d3051a4d20f/volumes" Mar 09 16:14:14 crc kubenswrapper[4831]: I0309 16:14:14.296883 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-snjjr"] Mar 09 16:14:14 crc kubenswrapper[4831]: I0309 16:14:14.297339 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-snjjr" podUID="42fcc95e-b9e5-4878-85a9-9af4f2f157df" containerName="registry-server" containerID="cri-o://2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca" gracePeriod=2 Mar 09 16:14:14 crc kubenswrapper[4831]: E0309 16:14:14.391194 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42fcc95e_b9e5_4878_85a9_9af4f2f157df.slice/crio-2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca.scope\": RecentStats: unable to find data in memory cache]" Mar 09 16:14:14 crc kubenswrapper[4831]: I0309 16:14:14.657304 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-snjjr" Mar 09 16:14:14 crc kubenswrapper[4831]: I0309 16:14:14.725464 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvpt2\" (UniqueName: \"kubernetes.io/projected/42fcc95e-b9e5-4878-85a9-9af4f2f157df-kube-api-access-rvpt2\") pod \"42fcc95e-b9e5-4878-85a9-9af4f2f157df\" (UID: \"42fcc95e-b9e5-4878-85a9-9af4f2f157df\") " Mar 09 16:14:14 crc kubenswrapper[4831]: I0309 16:14:14.730428 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fcc95e-b9e5-4878-85a9-9af4f2f157df-kube-api-access-rvpt2" (OuterVolumeSpecName: "kube-api-access-rvpt2") pod "42fcc95e-b9e5-4878-85a9-9af4f2f157df" (UID: "42fcc95e-b9e5-4878-85a9-9af4f2f157df"). InnerVolumeSpecName "kube-api-access-rvpt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:14:14 crc kubenswrapper[4831]: I0309 16:14:14.827415 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvpt2\" (UniqueName: \"kubernetes.io/projected/42fcc95e-b9e5-4878-85a9-9af4f2f157df-kube-api-access-rvpt2\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.095541 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-jf47m"] Mar 09 16:14:15 crc kubenswrapper[4831]: E0309 16:14:15.095778 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="extract-utilities" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.095791 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="extract-utilities" Mar 09 16:14:15 crc kubenswrapper[4831]: E0309 16:14:15.095802 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="registry-server" Mar 09 16:14:15 crc 
kubenswrapper[4831]: I0309 16:14:15.095807 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="registry-server" Mar 09 16:14:15 crc kubenswrapper[4831]: E0309 16:14:15.095824 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fcc95e-b9e5-4878-85a9-9af4f2f157df" containerName="registry-server" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.095830 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fcc95e-b9e5-4878-85a9-9af4f2f157df" containerName="registry-server" Mar 09 16:14:15 crc kubenswrapper[4831]: E0309 16:14:15.095841 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="extract-content" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.095846 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="extract-content" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.095933 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="659c4fc5-edfd-4a01-a53b-1d3051a4d20f" containerName="registry-server" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.095945 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fcc95e-b9e5-4878-85a9-9af4f2f157df" containerName="registry-server" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.096288 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.104712 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-jf47m"] Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.249110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrcmk\" (UniqueName: \"kubernetes.io/projected/f9b71116-cde3-4cb8-89c5-d24d4e379771-kube-api-access-vrcmk\") pod \"infra-operator-index-jf47m\" (UID: \"f9b71116-cde3-4cb8-89c5-d24d4e379771\") " pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.308482 4831 generic.go:334] "Generic (PLEG): container finished" podID="42fcc95e-b9e5-4878-85a9-9af4f2f157df" containerID="2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca" exitCode=0 Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.308529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-snjjr" event={"ID":"42fcc95e-b9e5-4878-85a9-9af4f2f157df","Type":"ContainerDied","Data":"2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca"} Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.308558 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-snjjr" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.308584 4831 scope.go:117] "RemoveContainer" containerID="2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.308568 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-snjjr" event={"ID":"42fcc95e-b9e5-4878-85a9-9af4f2f157df","Type":"ContainerDied","Data":"5f7806e26b6a6d40f50cb09538d05e1fe612a4a94a9081c84e09bc4d5b24b8e7"} Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.331954 4831 scope.go:117] "RemoveContainer" containerID="2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca" Mar 09 16:14:15 crc kubenswrapper[4831]: E0309 16:14:15.333115 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca\": container with ID starting with 2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca not found: ID does not exist" containerID="2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.333182 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca"} err="failed to get container status \"2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca\": rpc error: code = NotFound desc = could not find container \"2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca\": container with ID starting with 2f4740963d51734622cfddc15b47e8788c7621644298353ac1ed02cd9146cdca not found: ID does not exist" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.346129 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-snjjr"] Mar 09 
16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.350661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrcmk\" (UniqueName: \"kubernetes.io/projected/f9b71116-cde3-4cb8-89c5-d24d4e379771-kube-api-access-vrcmk\") pod \"infra-operator-index-jf47m\" (UID: \"f9b71116-cde3-4cb8-89c5-d24d4e379771\") " pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.351042 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-snjjr"] Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.377734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrcmk\" (UniqueName: \"kubernetes.io/projected/f9b71116-cde3-4cb8-89c5-d24d4e379771-kube-api-access-vrcmk\") pod \"infra-operator-index-jf47m\" (UID: \"f9b71116-cde3-4cb8-89c5-d24d4e379771\") " pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.466078 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.626134 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fcc95e-b9e5-4878-85a9-9af4f2f157df" path="/var/lib/kubelet/pods/42fcc95e-b9e5-4878-85a9-9af4f2f157df/volumes" Mar 09 16:14:15 crc kubenswrapper[4831]: I0309 16:14:15.909248 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-jf47m"] Mar 09 16:14:15 crc kubenswrapper[4831]: W0309 16:14:15.917373 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b71116_cde3_4cb8_89c5_d24d4e379771.slice/crio-0e1b61aaf48682f8b7e13f4adf35c50d17c8f7661e6f51466453d69d0c59588d WatchSource:0}: Error finding container 0e1b61aaf48682f8b7e13f4adf35c50d17c8f7661e6f51466453d69d0c59588d: Status 404 returned error can't find the container with id 0e1b61aaf48682f8b7e13f4adf35c50d17c8f7661e6f51466453d69d0c59588d Mar 09 16:14:16 crc kubenswrapper[4831]: I0309 16:14:16.317698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-jf47m" event={"ID":"f9b71116-cde3-4cb8-89c5-d24d4e379771","Type":"ContainerStarted","Data":"0e1b61aaf48682f8b7e13f4adf35c50d17c8f7661e6f51466453d69d0c59588d"} Mar 09 16:14:17 crc kubenswrapper[4831]: I0309 16:14:17.331031 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-jf47m" event={"ID":"f9b71116-cde3-4cb8-89c5-d24d4e379771","Type":"ContainerStarted","Data":"1655c209ddbc670ebb29ed9882be1b9fc34cb7956d64adf8d2c31a3eba409f24"} Mar 09 16:14:17 crc kubenswrapper[4831]: I0309 16:14:17.350934 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-jf47m" podStartSLOduration=1.891119367 podStartE2EDuration="2.350904193s" podCreationTimestamp="2026-03-09 16:14:15 +0000 UTC" 
firstStartedPulling="2026-03-09 16:14:15.91924676 +0000 UTC m=+983.052929203" lastFinishedPulling="2026-03-09 16:14:16.379031596 +0000 UTC m=+983.512714029" observedRunningTime="2026-03-09 16:14:17.348645448 +0000 UTC m=+984.482327931" watchObservedRunningTime="2026-03-09 16:14:17.350904193 +0000 UTC m=+984.484586656" Mar 09 16:14:25 crc kubenswrapper[4831]: I0309 16:14:25.467828 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:25 crc kubenswrapper[4831]: I0309 16:14:25.468783 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:25 crc kubenswrapper[4831]: I0309 16:14:25.511095 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:26 crc kubenswrapper[4831]: I0309 16:14:26.437672 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-jf47m" Mar 09 16:14:28 crc kubenswrapper[4831]: I0309 16:14:28.938679 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7"] Mar 09 16:14:28 crc kubenswrapper[4831]: I0309 16:14:28.940200 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:28 crc kubenswrapper[4831]: I0309 16:14:28.944496 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djwsq" Mar 09 16:14:28 crc kubenswrapper[4831]: I0309 16:14:28.957887 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7"] Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.053504 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.053673 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.053723 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xzwv\" (UniqueName: \"kubernetes.io/projected/7f3e2918-6395-4f53-8f31-e35c50beb83a-kube-api-access-8xzwv\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 
16:14:29.155634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.155781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.155881 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xzwv\" (UniqueName: \"kubernetes.io/projected/7f3e2918-6395-4f53-8f31-e35c50beb83a-kube-api-access-8xzwv\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.156448 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.156484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.187799 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xzwv\" (UniqueName: \"kubernetes.io/projected/7f3e2918-6395-4f53-8f31-e35c50beb83a-kube-api-access-8xzwv\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.261495 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:29 crc kubenswrapper[4831]: I0309 16:14:29.485106 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7"] Mar 09 16:14:30 crc kubenswrapper[4831]: I0309 16:14:30.438544 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerID="9fa5ed83bc3137596134fd48ecbb2375e4af6fe3c774ca2f86cf0d8666324b80" exitCode=0 Mar 09 16:14:30 crc kubenswrapper[4831]: I0309 16:14:30.438617 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" event={"ID":"7f3e2918-6395-4f53-8f31-e35c50beb83a","Type":"ContainerDied","Data":"9fa5ed83bc3137596134fd48ecbb2375e4af6fe3c774ca2f86cf0d8666324b80"} Mar 09 16:14:30 crc kubenswrapper[4831]: I0309 16:14:30.438927 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" event={"ID":"7f3e2918-6395-4f53-8f31-e35c50beb83a","Type":"ContainerStarted","Data":"34d015b1159c8023bdf1b74b042ed7c4c9b2241823cdb46134a1543614a51f7f"} Mar 09 16:14:31 crc kubenswrapper[4831]: I0309 16:14:31.450094 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerID="34d92f792ccc232064d421e8ba15b5d547aa5393ce456978c4b8f0cfe8112fa8" exitCode=0 Mar 09 16:14:31 crc kubenswrapper[4831]: I0309 16:14:31.450222 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" event={"ID":"7f3e2918-6395-4f53-8f31-e35c50beb83a","Type":"ContainerDied","Data":"34d92f792ccc232064d421e8ba15b5d547aa5393ce456978c4b8f0cfe8112fa8"} Mar 09 16:14:32 crc kubenswrapper[4831]: I0309 16:14:32.458757 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerID="19fcf66af35927c59ee56547e54652d88b82691a00f45c14b618f81bc67ffd31" exitCode=0 Mar 09 16:14:32 crc kubenswrapper[4831]: I0309 16:14:32.458808 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" event={"ID":"7f3e2918-6395-4f53-8f31-e35c50beb83a","Type":"ContainerDied","Data":"19fcf66af35927c59ee56547e54652d88b82691a00f45c14b618f81bc67ffd31"} Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.018598 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.018655 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.733825 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.922335 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-util\") pod \"7f3e2918-6395-4f53-8f31-e35c50beb83a\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.922461 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xzwv\" (UniqueName: \"kubernetes.io/projected/7f3e2918-6395-4f53-8f31-e35c50beb83a-kube-api-access-8xzwv\") pod \"7f3e2918-6395-4f53-8f31-e35c50beb83a\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.922526 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-bundle\") pod \"7f3e2918-6395-4f53-8f31-e35c50beb83a\" (UID: \"7f3e2918-6395-4f53-8f31-e35c50beb83a\") " Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.925747 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-bundle" (OuterVolumeSpecName: "bundle") pod "7f3e2918-6395-4f53-8f31-e35c50beb83a" (UID: "7f3e2918-6395-4f53-8f31-e35c50beb83a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.929093 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3e2918-6395-4f53-8f31-e35c50beb83a-kube-api-access-8xzwv" (OuterVolumeSpecName: "kube-api-access-8xzwv") pod "7f3e2918-6395-4f53-8f31-e35c50beb83a" (UID: "7f3e2918-6395-4f53-8f31-e35c50beb83a"). InnerVolumeSpecName "kube-api-access-8xzwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:14:33 crc kubenswrapper[4831]: I0309 16:14:33.937798 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-util" (OuterVolumeSpecName: "util") pod "7f3e2918-6395-4f53-8f31-e35c50beb83a" (UID: "7f3e2918-6395-4f53-8f31-e35c50beb83a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:14:34 crc kubenswrapper[4831]: I0309 16:14:34.024663 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xzwv\" (UniqueName: \"kubernetes.io/projected/7f3e2918-6395-4f53-8f31-e35c50beb83a-kube-api-access-8xzwv\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:34 crc kubenswrapper[4831]: I0309 16:14:34.025150 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:34 crc kubenswrapper[4831]: I0309 16:14:34.025286 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f3e2918-6395-4f53-8f31-e35c50beb83a-util\") on node \"crc\" DevicePath \"\"" Mar 09 16:14:34 crc kubenswrapper[4831]: I0309 16:14:34.479386 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" 
event={"ID":"7f3e2918-6395-4f53-8f31-e35c50beb83a","Type":"ContainerDied","Data":"34d015b1159c8023bdf1b74b042ed7c4c9b2241823cdb46134a1543614a51f7f"} Mar 09 16:14:34 crc kubenswrapper[4831]: I0309 16:14:34.479492 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34d015b1159c8023bdf1b74b042ed7c4c9b2241823cdb46134a1543614a51f7f" Mar 09 16:14:34 crc kubenswrapper[4831]: I0309 16:14:34.479511 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.030909 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4"] Mar 09 16:14:45 crc kubenswrapper[4831]: E0309 16:14:45.031712 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerName="pull" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.031731 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerName="pull" Mar 09 16:14:45 crc kubenswrapper[4831]: E0309 16:14:45.031747 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerName="extract" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.031755 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerName="extract" Mar 09 16:14:45 crc kubenswrapper[4831]: E0309 16:14:45.031772 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerName="util" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.031781 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerName="util" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.031940 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e2918-6395-4f53-8f31-e35c50beb83a" containerName="extract" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.032423 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.034434 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.034669 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tfzb5" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.043229 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4"] Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.190626 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmcxm\" (UniqueName: \"kubernetes.io/projected/aace6467-2dc4-43d8-ad52-740b182000dd-kube-api-access-wmcxm\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.190681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aace6467-2dc4-43d8-ad52-740b182000dd-webhook-cert\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.190716 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aace6467-2dc4-43d8-ad52-740b182000dd-apiservice-cert\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.292224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aace6467-2dc4-43d8-ad52-740b182000dd-apiservice-cert\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.292341 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmcxm\" (UniqueName: \"kubernetes.io/projected/aace6467-2dc4-43d8-ad52-740b182000dd-kube-api-access-wmcxm\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.292383 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aace6467-2dc4-43d8-ad52-740b182000dd-webhook-cert\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.297619 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aace6467-2dc4-43d8-ad52-740b182000dd-webhook-cert\") pod 
\"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.306003 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aace6467-2dc4-43d8-ad52-740b182000dd-apiservice-cert\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.308189 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmcxm\" (UniqueName: \"kubernetes.io/projected/aace6467-2dc4-43d8-ad52-740b182000dd-kube-api-access-wmcxm\") pod \"infra-operator-controller-manager-5fc6567686-lm7h4\" (UID: \"aace6467-2dc4-43d8-ad52-740b182000dd\") " pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.385162 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.565055 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4"] Mar 09 16:14:45 crc kubenswrapper[4831]: I0309 16:14:45.980911 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" event={"ID":"aace6467-2dc4-43d8-ad52-740b182000dd","Type":"ContainerStarted","Data":"efc125b8dba09bb39225f9b606e0cd7e1acddff117fe8c8ab0a9aea9a27b5269"} Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.445313 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.446726 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.450023 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-config-data" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.450017 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"kube-root-ca.crt" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.450182 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openshift-service-ca.crt" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.450245 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"galera-openstack-dockercfg-pl98c" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.462005 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-scripts" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.467925 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["swift-kuttl-tests/openstack-galera-0"] Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.475911 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.477548 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.485079 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.486100 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.486192 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.503496 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.524284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-config-data-default\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.524332 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6lnv\" (UniqueName: \"kubernetes.io/projected/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-kube-api-access-k6lnv\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.524363 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.524387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.524548 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-kolla-config\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.524575 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625446 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625507 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-kolla-config\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-config-data-default\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-kolla-config\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-config-data-default\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625676 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6lnv\" (UniqueName: \"kubernetes.io/projected/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-kube-api-access-k6lnv\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b2d162d-7772-4a94-8a79-4f9664637cce-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625710 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9ccv\" (UniqueName: \"kubernetes.io/projected/3b2d162d-7772-4a94-8a79-4f9664637cce-kube-api-access-s9ccv\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-config-data-default\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625749 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625840 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625955 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-operator-scripts\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.625999 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4gvn\" (UniqueName: \"kubernetes.io/projected/afb45109-97df-4bd1-80cc-f9374c213039-kube-api-access-s4gvn\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.626078 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-kolla-config\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.626097 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.626103 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.626151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.626185 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/afb45109-97df-4bd1-80cc-f9374c213039-config-data-generated\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.626253 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.626605 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") device mount path \"/mnt/openstack/pv12\"" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.627156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-kolla-config\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.628145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-config-data-default\") 
pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.629162 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.645386 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6lnv\" (UniqueName: \"kubernetes.io/projected/1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665-kube-api-access-k6lnv\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.647468 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.727601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-operator-scripts\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.727848 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4gvn\" (UniqueName: \"kubernetes.io/projected/afb45109-97df-4bd1-80cc-f9374c213039-kube-api-access-s4gvn\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 
16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.727887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.727908 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/afb45109-97df-4bd1-80cc-f9374c213039-config-data-generated\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.727928 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.727969 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728007 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-kolla-config\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728044 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-config-data-default\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728060 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-kolla-config\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728089 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b2d162d-7772-4a94-8a79-4f9664637cce-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728107 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-config-data-default\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728124 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9ccv\" (UniqueName: \"kubernetes.io/projected/3b2d162d-7772-4a94-8a79-4f9664637cce-kube-api-access-s9ccv\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728506 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") device mount path \"/mnt/openstack/pv02\"" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.728521 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") device mount path \"/mnt/openstack/pv06\"" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.729345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-operator-scripts\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.729479 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-config-data-default\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.729563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-kolla-config\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.730197 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/afb45109-97df-4bd1-80cc-f9374c213039-config-data-generated\") pod 
\"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.730570 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/afb45109-97df-4bd1-80cc-f9374c213039-config-data-default\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.732644 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b2d162d-7772-4a94-8a79-4f9664637cce-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.732925 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-kolla-config\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.733425 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b2d162d-7772-4a94-8a79-4f9664637cce-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.755390 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9ccv\" (UniqueName: \"kubernetes.io/projected/3b2d162d-7772-4a94-8a79-4f9664637cce-kube-api-access-s9ccv\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" 
Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.754179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4gvn\" (UniqueName: \"kubernetes.io/projected/afb45109-97df-4bd1-80cc-f9374c213039-kube-api-access-s4gvn\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.776588 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-1\" (UID: \"afb45109-97df-4bd1-80cc-f9374c213039\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.780144 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.783147 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"3b2d162d-7772-4a94-8a79-4f9664637cce\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.805116 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:14:47 crc kubenswrapper[4831]: I0309 16:14:47.812355 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:14:48 crc kubenswrapper[4831]: I0309 16:14:47.999800 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" event={"ID":"aace6467-2dc4-43d8-ad52-740b182000dd","Type":"ContainerStarted","Data":"520f0e9a83fbbb12833051d5b4ad9d959b344d72ab2a2f4a72cadf925ba1a9db"} Mar 09 16:14:48 crc kubenswrapper[4831]: I0309 16:14:48.000475 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:48 crc kubenswrapper[4831]: I0309 16:14:48.024952 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" podStartSLOduration=0.847690787 podStartE2EDuration="3.024936541s" podCreationTimestamp="2026-03-09 16:14:45 +0000 UTC" firstStartedPulling="2026-03-09 16:14:45.5750692 +0000 UTC m=+1012.708751623" lastFinishedPulling="2026-03-09 16:14:47.752314954 +0000 UTC m=+1014.885997377" observedRunningTime="2026-03-09 16:14:48.023011206 +0000 UTC m=+1015.156693639" watchObservedRunningTime="2026-03-09 16:14:48.024936541 +0000 UTC m=+1015.158618964" Mar 09 16:14:48 crc kubenswrapper[4831]: I0309 16:14:48.236445 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 09 16:14:48 crc kubenswrapper[4831]: W0309 16:14:48.241149 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c3d4d6d_e4fe_4ad6_84d1_60132bc0d665.slice/crio-3f34ae643363667930982f0760567a02b475078ee44be5a9b7a85af0cce52f79 WatchSource:0}: Error finding container 3f34ae643363667930982f0760567a02b475078ee44be5a9b7a85af0cce52f79: Status 404 returned error can't find the container with id 3f34ae643363667930982f0760567a02b475078ee44be5a9b7a85af0cce52f79 Mar 09 16:14:48 
crc kubenswrapper[4831]: I0309 16:14:48.269935 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 09 16:14:48 crc kubenswrapper[4831]: W0309 16:14:48.271896 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb45109_97df_4bd1_80cc_f9374c213039.slice/crio-82fca14567997aabcd89d852b026a372d0ad5bc1249f1f972e75486ce0e0712b WatchSource:0}: Error finding container 82fca14567997aabcd89d852b026a372d0ad5bc1249f1f972e75486ce0e0712b: Status 404 returned error can't find the container with id 82fca14567997aabcd89d852b026a372d0ad5bc1249f1f972e75486ce0e0712b Mar 09 16:14:48 crc kubenswrapper[4831]: I0309 16:14:48.283534 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 09 16:14:48 crc kubenswrapper[4831]: W0309 16:14:48.288947 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b2d162d_7772_4a94_8a79_4f9664637cce.slice/crio-ec3081c44670a05ab6ad5ea17d0aa8576e3824f62d748f6132e131c5a39730bf WatchSource:0}: Error finding container ec3081c44670a05ab6ad5ea17d0aa8576e3824f62d748f6132e131c5a39730bf: Status 404 returned error can't find the container with id ec3081c44670a05ab6ad5ea17d0aa8576e3824f62d748f6132e131c5a39730bf Mar 09 16:14:49 crc kubenswrapper[4831]: I0309 16:14:49.020582 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665","Type":"ContainerStarted","Data":"3f34ae643363667930982f0760567a02b475078ee44be5a9b7a85af0cce52f79"} Mar 09 16:14:49 crc kubenswrapper[4831]: I0309 16:14:49.023211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" 
event={"ID":"3b2d162d-7772-4a94-8a79-4f9664637cce","Type":"ContainerStarted","Data":"ec3081c44670a05ab6ad5ea17d0aa8576e3824f62d748f6132e131c5a39730bf"} Mar 09 16:14:49 crc kubenswrapper[4831]: I0309 16:14:49.025220 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"afb45109-97df-4bd1-80cc-f9374c213039","Type":"ContainerStarted","Data":"82fca14567997aabcd89d852b026a372d0ad5bc1249f1f972e75486ce0e0712b"} Mar 09 16:14:55 crc kubenswrapper[4831]: I0309 16:14:55.389516 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fc6567686-lm7h4" Mar 09 16:14:55 crc kubenswrapper[4831]: I0309 16:14:55.550345 4831 scope.go:117] "RemoveContainer" containerID="878379a99e9fd114576b8468d1a26af8e7f697904ffb0668e0cf5c42c793857d" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.292976 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/memcached-0"] Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.294209 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.297746 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"memcached-memcached-dockercfg-hdnz2" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.303570 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"] Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.306604 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"memcached-config-data" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.369393 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73fd1de5-1028-47e5-b204-e69c0f1cd028-kolla-config\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.369484 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt2hs\" (UniqueName: \"kubernetes.io/projected/73fd1de5-1028-47e5-b204-e69c0f1cd028-kube-api-access-kt2hs\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.369534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73fd1de5-1028-47e5-b204-e69c0f1cd028-config-data\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.470923 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt2hs\" (UniqueName: 
\"kubernetes.io/projected/73fd1de5-1028-47e5-b204-e69c0f1cd028-kube-api-access-kt2hs\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.470994 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73fd1de5-1028-47e5-b204-e69c0f1cd028-config-data\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.471035 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73fd1de5-1028-47e5-b204-e69c0f1cd028-kolla-config\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.471862 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73fd1de5-1028-47e5-b204-e69c0f1cd028-kolla-config\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.471970 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73fd1de5-1028-47e5-b204-e69c0f1cd028-config-data\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc kubenswrapper[4831]: I0309 16:14:56.496196 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt2hs\" (UniqueName: \"kubernetes.io/projected/73fd1de5-1028-47e5-b204-e69c0f1cd028-kube-api-access-kt2hs\") pod \"memcached-0\" (UID: \"73fd1de5-1028-47e5-b204-e69c0f1cd028\") " pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:56 crc 
kubenswrapper[4831]: I0309 16:14:56.611082 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0" Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.460004 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"] Mar 09 16:14:58 crc kubenswrapper[4831]: W0309 16:14:58.470787 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73fd1de5_1028_47e5_b204_e69c0f1cd028.slice/crio-2b9f0c38376fdfafc090ada3462bf73f3828ff347ec1579d8c1e7fc2acab75f8 WatchSource:0}: Error finding container 2b9f0c38376fdfafc090ada3462bf73f3828ff347ec1579d8c1e7fc2acab75f8: Status 404 returned error can't find the container with id 2b9f0c38376fdfafc090ada3462bf73f3828ff347ec1579d8c1e7fc2acab75f8 Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.501204 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-46qxh"] Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.501998 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.504123 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-2w5vd" Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.504854 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-46qxh"] Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.596173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77fmm\" (UniqueName: \"kubernetes.io/projected/0b2fd94e-96bb-4738-b01a-1a6f7ed28057-kube-api-access-77fmm\") pod \"rabbitmq-cluster-operator-index-46qxh\" (UID: \"0b2fd94e-96bb-4738-b01a-1a6f7ed28057\") " pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.697583 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77fmm\" (UniqueName: \"kubernetes.io/projected/0b2fd94e-96bb-4738-b01a-1a6f7ed28057-kube-api-access-77fmm\") pod \"rabbitmq-cluster-operator-index-46qxh\" (UID: \"0b2fd94e-96bb-4738-b01a-1a6f7ed28057\") " pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.714563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77fmm\" (UniqueName: \"kubernetes.io/projected/0b2fd94e-96bb-4738-b01a-1a6f7ed28057-kube-api-access-77fmm\") pod \"rabbitmq-cluster-operator-index-46qxh\" (UID: \"0b2fd94e-96bb-4738-b01a-1a6f7ed28057\") " pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" Mar 09 16:14:58 crc kubenswrapper[4831]: I0309 16:14:58.818790 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" Mar 09 16:14:59 crc kubenswrapper[4831]: I0309 16:14:59.014576 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-46qxh"] Mar 09 16:14:59 crc kubenswrapper[4831]: W0309 16:14:59.022387 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2fd94e_96bb_4738_b01a_1a6f7ed28057.slice/crio-7eb392d3f0e17cdc0732694e85ece0b42a518311b2b67f57ed75c746e7eaf98a WatchSource:0}: Error finding container 7eb392d3f0e17cdc0732694e85ece0b42a518311b2b67f57ed75c746e7eaf98a: Status 404 returned error can't find the container with id 7eb392d3f0e17cdc0732694e85ece0b42a518311b2b67f57ed75c746e7eaf98a Mar 09 16:14:59 crc kubenswrapper[4831]: I0309 16:14:59.089242 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"afb45109-97df-4bd1-80cc-f9374c213039","Type":"ContainerStarted","Data":"68984cb221e0ee7c85886b6069f0a0f8dd65a484871694ce4c4c6fe52ee1f0a0"} Mar 09 16:14:59 crc kubenswrapper[4831]: I0309 16:14:59.090518 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665","Type":"ContainerStarted","Data":"834e9c2afcf98f7cce19bf633ab976666d8c64fa51681cd3bac09e292acdadb7"} Mar 09 16:14:59 crc kubenswrapper[4831]: I0309 16:14:59.094538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"73fd1de5-1028-47e5-b204-e69c0f1cd028","Type":"ContainerStarted","Data":"2b9f0c38376fdfafc090ada3462bf73f3828ff347ec1579d8c1e7fc2acab75f8"} Mar 09 16:14:59 crc kubenswrapper[4831]: I0309 16:14:59.097809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" 
event={"ID":"3b2d162d-7772-4a94-8a79-4f9664637cce","Type":"ContainerStarted","Data":"a9f3baeb3e94c7c9525c8f0e1a41f003b3199436f40fb72934c744cf582a1f91"} Mar 09 16:14:59 crc kubenswrapper[4831]: I0309 16:14:59.100246 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" event={"ID":"0b2fd94e-96bb-4738-b01a-1a6f7ed28057","Type":"ContainerStarted","Data":"7eb392d3f0e17cdc0732694e85ece0b42a518311b2b67f57ed75c746e7eaf98a"} Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.125164 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx"] Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.126374 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.130512 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.131225 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.134217 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx"] Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.219811 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-secret-volume\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.219898 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-config-volume\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.220046 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqpq\" (UniqueName: \"kubernetes.io/projected/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-kube-api-access-vkqpq\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.321906 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-secret-volume\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.321970 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-config-volume\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.322053 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqpq\" (UniqueName: \"kubernetes.io/projected/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-kube-api-access-vkqpq\") pod \"collect-profiles-29551215-sdmnx\" (UID: 
\"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.323355 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-config-volume\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.329929 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-secret-volume\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.344100 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqpq\" (UniqueName: \"kubernetes.io/projected/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-kube-api-access-vkqpq\") pod \"collect-profiles-29551215-sdmnx\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.459007 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:00 crc kubenswrapper[4831]: I0309 16:15:00.916943 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx"] Mar 09 16:15:01 crc kubenswrapper[4831]: W0309 16:15:01.013598 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3e21af_9eb1_4bcb_9485_3abd38553a4e.slice/crio-6003704cfba1bbcaf5d37e892a5658ff94e4f7b87fb21ec08ee1306e0c89d39b WatchSource:0}: Error finding container 6003704cfba1bbcaf5d37e892a5658ff94e4f7b87fb21ec08ee1306e0c89d39b: Status 404 returned error can't find the container with id 6003704cfba1bbcaf5d37e892a5658ff94e4f7b87fb21ec08ee1306e0c89d39b Mar 09 16:15:01 crc kubenswrapper[4831]: I0309 16:15:01.127127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"73fd1de5-1028-47e5-b204-e69c0f1cd028","Type":"ContainerStarted","Data":"c128fe14b9ebb4e44c74ac5998647b98ee1a346ba2cf5450d1989457fb637a7c"} Mar 09 16:15:01 crc kubenswrapper[4831]: I0309 16:15:01.128076 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/memcached-0" Mar 09 16:15:01 crc kubenswrapper[4831]: I0309 16:15:01.129718 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" event={"ID":"ac3e21af-9eb1-4bcb-9485-3abd38553a4e","Type":"ContainerStarted","Data":"6003704cfba1bbcaf5d37e892a5658ff94e4f7b87fb21ec08ee1306e0c89d39b"} Mar 09 16:15:01 crc kubenswrapper[4831]: I0309 16:15:01.147377 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/memcached-0" podStartSLOduration=3.093883528 podStartE2EDuration="5.147355096s" podCreationTimestamp="2026-03-09 16:14:56 +0000 UTC" firstStartedPulling="2026-03-09 16:14:58.479867879 
+0000 UTC m=+1025.613550302" lastFinishedPulling="2026-03-09 16:15:00.533339447 +0000 UTC m=+1027.667021870" observedRunningTime="2026-03-09 16:15:01.143615368 +0000 UTC m=+1028.277297791" watchObservedRunningTime="2026-03-09 16:15:01.147355096 +0000 UTC m=+1028.281037519" Mar 09 16:15:02 crc kubenswrapper[4831]: I0309 16:15:02.153082 4831 generic.go:334] "Generic (PLEG): container finished" podID="afb45109-97df-4bd1-80cc-f9374c213039" containerID="68984cb221e0ee7c85886b6069f0a0f8dd65a484871694ce4c4c6fe52ee1f0a0" exitCode=0 Mar 09 16:15:02 crc kubenswrapper[4831]: I0309 16:15:02.153170 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"afb45109-97df-4bd1-80cc-f9374c213039","Type":"ContainerDied","Data":"68984cb221e0ee7c85886b6069f0a0f8dd65a484871694ce4c4c6fe52ee1f0a0"} Mar 09 16:15:02 crc kubenswrapper[4831]: I0309 16:15:02.156019 4831 generic.go:334] "Generic (PLEG): container finished" podID="1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665" containerID="834e9c2afcf98f7cce19bf633ab976666d8c64fa51681cd3bac09e292acdadb7" exitCode=0 Mar 09 16:15:02 crc kubenswrapper[4831]: I0309 16:15:02.156090 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665","Type":"ContainerDied","Data":"834e9c2afcf98f7cce19bf633ab976666d8c64fa51681cd3bac09e292acdadb7"} Mar 09 16:15:02 crc kubenswrapper[4831]: I0309 16:15:02.158758 4831 generic.go:334] "Generic (PLEG): container finished" podID="3b2d162d-7772-4a94-8a79-4f9664637cce" containerID="a9f3baeb3e94c7c9525c8f0e1a41f003b3199436f40fb72934c744cf582a1f91" exitCode=0 Mar 09 16:15:02 crc kubenswrapper[4831]: I0309 16:15:02.158842 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"3b2d162d-7772-4a94-8a79-4f9664637cce","Type":"ContainerDied","Data":"a9f3baeb3e94c7c9525c8f0e1a41f003b3199436f40fb72934c744cf582a1f91"} Mar 09 16:15:02 crc 
kubenswrapper[4831]: I0309 16:15:02.688500 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-46qxh"] Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.018566 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.018926 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.019193 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.019965 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1edaf25bc17b1a3de007db1b821f8bf147583ed96a9d4890d9a1fd5ed460feab"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.020069 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://1edaf25bc17b1a3de007db1b821f8bf147583ed96a9d4890d9a1fd5ed460feab" gracePeriod=600 Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 
16:15:03.177327 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665","Type":"ContainerStarted","Data":"6cbd296891661217454332b9427cb6ad9289b93b29b18c37963f6cecc9d8ed15"} Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.179341 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="1edaf25bc17b1a3de007db1b821f8bf147583ed96a9d4890d9a1fd5ed460feab" exitCode=0 Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.179420 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"1edaf25bc17b1a3de007db1b821f8bf147583ed96a9d4890d9a1fd5ed460feab"} Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.179460 4831 scope.go:117] "RemoveContainer" containerID="1c76c6414699eae246a06d1e97818d4928fb333dbba2c6b4163c0c46e43c62ea" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.181588 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" event={"ID":"ac3e21af-9eb1-4bcb-9485-3abd38553a4e","Type":"ContainerStarted","Data":"8a2328530d50b34f5220327bae1fe488b0da842b41bc71e2f7e5fca66e5ab733"} Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.183306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"3b2d162d-7772-4a94-8a79-4f9664637cce","Type":"ContainerStarted","Data":"d2b5b65c71994fa488687963fc1fde3280afb6b894e85cca05ff5573f3f01d0a"} Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.204458 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" 
event={"ID":"afb45109-97df-4bd1-80cc-f9374c213039","Type":"ContainerStarted","Data":"276225a2e55bba1cf6e9daa5513d7d08b2b7bf3778f6596b76a2598a69d6a4ac"} Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.232159 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-0" podStartSLOduration=7.201935995 podStartE2EDuration="17.232143294s" podCreationTimestamp="2026-03-09 16:14:46 +0000 UTC" firstStartedPulling="2026-03-09 16:14:48.243062751 +0000 UTC m=+1015.376745174" lastFinishedPulling="2026-03-09 16:14:58.27327005 +0000 UTC m=+1025.406952473" observedRunningTime="2026-03-09 16:15:03.226310296 +0000 UTC m=+1030.359992719" watchObservedRunningTime="2026-03-09 16:15:03.232143294 +0000 UTC m=+1030.365825717" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.279628 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-1" podStartSLOduration=7.215825533 podStartE2EDuration="17.279611268s" podCreationTimestamp="2026-03-09 16:14:46 +0000 UTC" firstStartedPulling="2026-03-09 16:14:48.274273018 +0000 UTC m=+1015.407955441" lastFinishedPulling="2026-03-09 16:14:58.338058753 +0000 UTC m=+1025.471741176" observedRunningTime="2026-03-09 16:15:03.275764338 +0000 UTC m=+1030.409446751" watchObservedRunningTime="2026-03-09 16:15:03.279611268 +0000 UTC m=+1030.413293691" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.306540 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-ltn84"] Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.307429 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.326022 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-2" podStartSLOduration=7.328577585 podStartE2EDuration="17.326005882s" podCreationTimestamp="2026-03-09 16:14:46 +0000 UTC" firstStartedPulling="2026-03-09 16:14:48.290502005 +0000 UTC m=+1015.424184428" lastFinishedPulling="2026-03-09 16:14:58.287930312 +0000 UTC m=+1025.421612725" observedRunningTime="2026-03-09 16:15:03.31793181 +0000 UTC m=+1030.451614233" watchObservedRunningTime="2026-03-09 16:15:03.326005882 +0000 UTC m=+1030.459688305" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.338482 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-ltn84"] Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.473137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqpb\" (UniqueName: \"kubernetes.io/projected/da24a64e-1143-493e-a8ea-c944fabd209e-kube-api-access-5wqpb\") pod \"rabbitmq-cluster-operator-index-ltn84\" (UID: \"da24a64e-1143-493e-a8ea-c944fabd209e\") " pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.573980 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqpb\" (UniqueName: \"kubernetes.io/projected/da24a64e-1143-493e-a8ea-c944fabd209e-kube-api-access-5wqpb\") pod \"rabbitmq-cluster-operator-index-ltn84\" (UID: \"da24a64e-1143-493e-a8ea-c944fabd209e\") " pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.602234 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqpb\" (UniqueName: 
\"kubernetes.io/projected/da24a64e-1143-493e-a8ea-c944fabd209e-kube-api-access-5wqpb\") pod \"rabbitmq-cluster-operator-index-ltn84\" (UID: \"da24a64e-1143-493e-a8ea-c944fabd209e\") " pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:03 crc kubenswrapper[4831]: I0309 16:15:03.635030 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.212625 4831 generic.go:334] "Generic (PLEG): container finished" podID="ac3e21af-9eb1-4bcb-9485-3abd38553a4e" containerID="8a2328530d50b34f5220327bae1fe488b0da842b41bc71e2f7e5fca66e5ab733" exitCode=0 Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.212680 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" event={"ID":"ac3e21af-9eb1-4bcb-9485-3abd38553a4e","Type":"ContainerDied","Data":"8a2328530d50b34f5220327bae1fe488b0da842b41bc71e2f7e5fca66e5ab733"} Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.536067 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.586805 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkqpq\" (UniqueName: \"kubernetes.io/projected/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-kube-api-access-vkqpq\") pod \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.586910 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-secret-volume\") pod \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.587022 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-config-volume\") pod \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\" (UID: \"ac3e21af-9eb1-4bcb-9485-3abd38553a4e\") " Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.588490 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac3e21af-9eb1-4bcb-9485-3abd38553a4e" (UID: "ac3e21af-9eb1-4bcb-9485-3abd38553a4e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.593835 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac3e21af-9eb1-4bcb-9485-3abd38553a4e" (UID: "ac3e21af-9eb1-4bcb-9485-3abd38553a4e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.598077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-kube-api-access-vkqpq" (OuterVolumeSpecName: "kube-api-access-vkqpq") pod "ac3e21af-9eb1-4bcb-9485-3abd38553a4e" (UID: "ac3e21af-9eb1-4bcb-9485-3abd38553a4e"). InnerVolumeSpecName "kube-api-access-vkqpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.688638 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.688787 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkqpq\" (UniqueName: \"kubernetes.io/projected/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-kube-api-access-vkqpq\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.688804 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac3e21af-9eb1-4bcb-9485-3abd38553a4e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:04 crc kubenswrapper[4831]: I0309 16:15:04.898079 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-ltn84"] Mar 09 16:15:04 crc kubenswrapper[4831]: W0309 16:15:04.903545 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda24a64e_1143_493e_a8ea_c944fabd209e.slice/crio-614ec2ebfcece1625715ae996639cbe18e7bc7ce40f9068e219e291bf7e17d24 WatchSource:0}: Error finding container 614ec2ebfcece1625715ae996639cbe18e7bc7ce40f9068e219e291bf7e17d24: Status 404 returned error can't find the container with id 
614ec2ebfcece1625715ae996639cbe18e7bc7ce40f9068e219e291bf7e17d24 Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.219816 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" event={"ID":"0b2fd94e-96bb-4738-b01a-1a6f7ed28057","Type":"ContainerStarted","Data":"048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d"} Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.219936 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" podUID="0b2fd94e-96bb-4738-b01a-1a6f7ed28057" containerName="registry-server" containerID="cri-o://048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d" gracePeriod=2 Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.220956 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" event={"ID":"da24a64e-1143-493e-a8ea-c944fabd209e","Type":"ContainerStarted","Data":"614ec2ebfcece1625715ae996639cbe18e7bc7ce40f9068e219e291bf7e17d24"} Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.223054 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"076daca06d23b29c2390e1c6817586c0ffa3caca70c6f9a78734cb8feec3892c"} Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.229009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" event={"ID":"ac3e21af-9eb1-4bcb-9485-3abd38553a4e","Type":"ContainerDied","Data":"6003704cfba1bbcaf5d37e892a5658ff94e4f7b87fb21ec08ee1306e0c89d39b"} Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.229047 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6003704cfba1bbcaf5d37e892a5658ff94e4f7b87fb21ec08ee1306e0c89d39b" Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.229054 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551215-sdmnx" Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.239976 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" podStartSLOduration=1.681546221 podStartE2EDuration="7.239953098s" podCreationTimestamp="2026-03-09 16:14:58 +0000 UTC" firstStartedPulling="2026-03-09 16:14:59.02365097 +0000 UTC m=+1026.157333393" lastFinishedPulling="2026-03-09 16:15:04.582057847 +0000 UTC m=+1031.715740270" observedRunningTime="2026-03-09 16:15:05.239912237 +0000 UTC m=+1032.373594670" watchObservedRunningTime="2026-03-09 16:15:05.239953098 +0000 UTC m=+1032.373635521" Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.656118 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.702940 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77fmm\" (UniqueName: \"kubernetes.io/projected/0b2fd94e-96bb-4738-b01a-1a6f7ed28057-kube-api-access-77fmm\") pod \"0b2fd94e-96bb-4738-b01a-1a6f7ed28057\" (UID: \"0b2fd94e-96bb-4738-b01a-1a6f7ed28057\") " Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.718595 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2fd94e-96bb-4738-b01a-1a6f7ed28057-kube-api-access-77fmm" (OuterVolumeSpecName: "kube-api-access-77fmm") pod "0b2fd94e-96bb-4738-b01a-1a6f7ed28057" (UID: "0b2fd94e-96bb-4738-b01a-1a6f7ed28057"). InnerVolumeSpecName "kube-api-access-77fmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:15:05 crc kubenswrapper[4831]: I0309 16:15:05.804830 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77fmm\" (UniqueName: \"kubernetes.io/projected/0b2fd94e-96bb-4738-b01a-1a6f7ed28057-kube-api-access-77fmm\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.237636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" event={"ID":"da24a64e-1143-493e-a8ea-c944fabd209e","Type":"ContainerStarted","Data":"fdeed1bf728670795fab61699c8706cf1e7082ac834064ffdeeeec70a1c49e32"} Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.240856 4831 generic.go:334] "Generic (PLEG): container finished" podID="0b2fd94e-96bb-4738-b01a-1a6f7ed28057" containerID="048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d" exitCode=0 Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.240898 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" event={"ID":"0b2fd94e-96bb-4738-b01a-1a6f7ed28057","Type":"ContainerDied","Data":"048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d"} Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.240934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" event={"ID":"0b2fd94e-96bb-4738-b01a-1a6f7ed28057","Type":"ContainerDied","Data":"7eb392d3f0e17cdc0732694e85ece0b42a518311b2b67f57ed75c746e7eaf98a"} Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.240951 4831 scope.go:117] "RemoveContainer" containerID="048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d" Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.240907 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-46qxh" Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.252990 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" podStartSLOduration=2.77225683 podStartE2EDuration="3.252972538s" podCreationTimestamp="2026-03-09 16:15:03 +0000 UTC" firstStartedPulling="2026-03-09 16:15:04.907188113 +0000 UTC m=+1032.040870536" lastFinishedPulling="2026-03-09 16:15:05.387903821 +0000 UTC m=+1032.521586244" observedRunningTime="2026-03-09 16:15:06.250818336 +0000 UTC m=+1033.384500769" watchObservedRunningTime="2026-03-09 16:15:06.252972538 +0000 UTC m=+1033.386654961" Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.275695 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-46qxh"] Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.279817 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-46qxh"] Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.280004 4831 scope.go:117] "RemoveContainer" containerID="048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d" Mar 09 16:15:06 crc kubenswrapper[4831]: E0309 16:15:06.281557 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d\": container with ID starting with 048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d not found: ID does not exist" containerID="048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d" Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.281600 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d"} err="failed to get 
container status \"048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d\": rpc error: code = NotFound desc = could not find container \"048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d\": container with ID starting with 048f15b81f7bb61411f8a4901629aade8d4a618f058aad731ac9d7939779532d not found: ID does not exist" Mar 09 16:15:06 crc kubenswrapper[4831]: I0309 16:15:06.612738 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/memcached-0" Mar 09 16:15:07 crc kubenswrapper[4831]: I0309 16:15:07.630986 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2fd94e-96bb-4738-b01a-1a6f7ed28057" path="/var/lib/kubelet/pods/0b2fd94e-96bb-4738-b01a-1a6f7ed28057/volumes" Mar 09 16:15:07 crc kubenswrapper[4831]: I0309 16:15:07.781333 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:15:07 crc kubenswrapper[4831]: I0309 16:15:07.781381 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:15:07 crc kubenswrapper[4831]: I0309 16:15:07.806258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:15:07 crc kubenswrapper[4831]: I0309 16:15:07.806317 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:15:07 crc kubenswrapper[4831]: I0309 16:15:07.812553 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:15:07 crc kubenswrapper[4831]: I0309 16:15:07.813035 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:15:09 crc kubenswrapper[4831]: I0309 16:15:09.891224 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:15:09 crc kubenswrapper[4831]: I0309 16:15:09.960169 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 16:15:10 crc kubenswrapper[4831]: E0309 16:15:10.242048 4831 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.162:40210->38.102.83.162:46465: read tcp 38.102.83.162:40210->38.102.83.162:46465: read: connection reset by peer Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.500169 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mgdcd"] Mar 09 16:15:10 crc kubenswrapper[4831]: E0309 16:15:10.500899 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3e21af-9eb1-4bcb-9485-3abd38553a4e" containerName="collect-profiles" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.500926 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3e21af-9eb1-4bcb-9485-3abd38553a4e" containerName="collect-profiles" Mar 09 16:15:10 crc kubenswrapper[4831]: E0309 16:15:10.500953 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2fd94e-96bb-4738-b01a-1a6f7ed28057" containerName="registry-server" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.500963 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2fd94e-96bb-4738-b01a-1a6f7ed28057" containerName="registry-server" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.501241 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2fd94e-96bb-4738-b01a-1a6f7ed28057" containerName="registry-server" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.501265 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3e21af-9eb1-4bcb-9485-3abd38553a4e" containerName="collect-profiles" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.502740 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.507271 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgdcd"] Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.669074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-catalog-content\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.669129 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-utilities\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.669384 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq8gx\" (UniqueName: \"kubernetes.io/projected/fbca9e2b-e508-44a4-bc50-dbf37f511e63-kube-api-access-lq8gx\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.770629 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-catalog-content\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.770712 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-utilities\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.770739 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8gx\" (UniqueName: \"kubernetes.io/projected/fbca9e2b-e508-44a4-bc50-dbf37f511e63-kube-api-access-lq8gx\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.771739 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-catalog-content\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.772025 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-utilities\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.800702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8gx\" (UniqueName: \"kubernetes.io/projected/fbca9e2b-e508-44a4-bc50-dbf37f511e63-kube-api-access-lq8gx\") pod \"community-operators-mgdcd\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:10 crc kubenswrapper[4831]: I0309 16:15:10.834779 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:11 crc kubenswrapper[4831]: I0309 16:15:11.285523 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgdcd"] Mar 09 16:15:12 crc kubenswrapper[4831]: I0309 16:15:12.277642 4831 generic.go:334] "Generic (PLEG): container finished" podID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerID="fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07" exitCode=0 Mar 09 16:15:12 crc kubenswrapper[4831]: I0309 16:15:12.277704 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgdcd" event={"ID":"fbca9e2b-e508-44a4-bc50-dbf37f511e63","Type":"ContainerDied","Data":"fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07"} Mar 09 16:15:12 crc kubenswrapper[4831]: I0309 16:15:12.278184 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgdcd" event={"ID":"fbca9e2b-e508-44a4-bc50-dbf37f511e63","Type":"ContainerStarted","Data":"418a92de0eb48deb4a7d4754e96bf26b27dc17a6566f907abb4ac80793090e17"} Mar 09 16:15:13 crc kubenswrapper[4831]: I0309 16:15:13.285754 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgdcd" event={"ID":"fbca9e2b-e508-44a4-bc50-dbf37f511e63","Type":"ContainerStarted","Data":"7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb"} Mar 09 16:15:13 crc kubenswrapper[4831]: I0309 16:15:13.635601 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:13 crc kubenswrapper[4831]: I0309 16:15:13.635648 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:13 crc kubenswrapper[4831]: I0309 16:15:13.667764 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:14 crc kubenswrapper[4831]: I0309 16:15:14.297121 4831 generic.go:334] "Generic (PLEG): container finished" podID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerID="7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb" exitCode=0 Mar 09 16:15:14 crc kubenswrapper[4831]: I0309 16:15:14.297255 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgdcd" event={"ID":"fbca9e2b-e508-44a4-bc50-dbf37f511e63","Type":"ContainerDied","Data":"7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb"} Mar 09 16:15:14 crc kubenswrapper[4831]: I0309 16:15:14.332943 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-ltn84" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.315572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgdcd" event={"ID":"fbca9e2b-e508-44a4-bc50-dbf37f511e63","Type":"ContainerStarted","Data":"b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9"} Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.339690 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mgdcd" podStartSLOduration=2.498879757 podStartE2EDuration="6.339671671s" podCreationTimestamp="2026-03-09 16:15:10 +0000 UTC" firstStartedPulling="2026-03-09 16:15:12.280057106 +0000 UTC m=+1039.413739529" lastFinishedPulling="2026-03-09 16:15:16.12084902 +0000 UTC m=+1043.254531443" observedRunningTime="2026-03-09 16:15:16.334418019 +0000 UTC m=+1043.468100462" watchObservedRunningTime="2026-03-09 16:15:16.339671671 +0000 UTC m=+1043.473354094" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.481806 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/root-account-create-update-htfm7"] Mar 09 
16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.482758 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.484312 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.494117 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-htfm7"] Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.657124 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2r6\" (UniqueName: \"kubernetes.io/projected/65f34546-4a53-44eb-bac8-59527096a882-kube-api-access-7q2r6\") pod \"root-account-create-update-htfm7\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.657381 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65f34546-4a53-44eb-bac8-59527096a882-operator-scripts\") pod \"root-account-create-update-htfm7\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.758907 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q2r6\" (UniqueName: \"kubernetes.io/projected/65f34546-4a53-44eb-bac8-59527096a882-kube-api-access-7q2r6\") pod \"root-account-create-update-htfm7\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.758991 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/65f34546-4a53-44eb-bac8-59527096a882-operator-scripts\") pod \"root-account-create-update-htfm7\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.759909 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65f34546-4a53-44eb-bac8-59527096a882-operator-scripts\") pod \"root-account-create-update-htfm7\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.782432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q2r6\" (UniqueName: \"kubernetes.io/projected/65f34546-4a53-44eb-bac8-59527096a882-kube-api-access-7q2r6\") pod \"root-account-create-update-htfm7\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:16 crc kubenswrapper[4831]: I0309 16:15:16.845090 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.142266 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg"] Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.143912 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.145586 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djwsq" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.150828 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg"] Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.265124 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.265175 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nc29\" (UniqueName: \"kubernetes.io/projected/8a3b8608-bf80-4c51-a661-65b0c5056ecc-kube-api-access-2nc29\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.265296 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 
16:15:17.282410 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-htfm7"] Mar 09 16:15:17 crc kubenswrapper[4831]: W0309 16:15:17.285851 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f34546_4a53_44eb_bac8_59527096a882.slice/crio-c609fb0bc1784f559185bcc6a7bfa147d7d1ca86f8af129213f7d0d01de6f31f WatchSource:0}: Error finding container c609fb0bc1784f559185bcc6a7bfa147d7d1ca86f8af129213f7d0d01de6f31f: Status 404 returned error can't find the container with id c609fb0bc1784f559185bcc6a7bfa147d7d1ca86f8af129213f7d0d01de6f31f Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.321322 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-htfm7" event={"ID":"65f34546-4a53-44eb-bac8-59527096a882","Type":"ContainerStarted","Data":"c609fb0bc1784f559185bcc6a7bfa147d7d1ca86f8af129213f7d0d01de6f31f"} Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.366930 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.367055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.367075 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2nc29\" (UniqueName: \"kubernetes.io/projected/8a3b8608-bf80-4c51-a661-65b0c5056ecc-kube-api-access-2nc29\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.367754 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.367823 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.396617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nc29\" (UniqueName: \"kubernetes.io/projected/8a3b8608-bf80-4c51-a661-65b0c5056ecc-kube-api-access-2nc29\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.460950 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:17 crc kubenswrapper[4831]: I0309 16:15:17.912288 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg"] Mar 09 16:15:18 crc kubenswrapper[4831]: I0309 16:15:18.329043 4831 generic.go:334] "Generic (PLEG): container finished" podID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerID="28322c74b9f74315db3284a736c3ddcc3e1ca8ed2254c54f05fb5a8b52bc9d36" exitCode=0 Mar 09 16:15:18 crc kubenswrapper[4831]: I0309 16:15:18.329327 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" event={"ID":"8a3b8608-bf80-4c51-a661-65b0c5056ecc","Type":"ContainerDied","Data":"28322c74b9f74315db3284a736c3ddcc3e1ca8ed2254c54f05fb5a8b52bc9d36"} Mar 09 16:15:18 crc kubenswrapper[4831]: I0309 16:15:18.329353 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" event={"ID":"8a3b8608-bf80-4c51-a661-65b0c5056ecc","Type":"ContainerStarted","Data":"7ec008f34af4246f9b71109bb2eb6d97ce1ff672fa00b6f90f9e14f3feb72839"} Mar 09 16:15:18 crc kubenswrapper[4831]: I0309 16:15:18.332170 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-htfm7" event={"ID":"65f34546-4a53-44eb-bac8-59527096a882","Type":"ContainerStarted","Data":"17acb04554823975693117326773e6242f3cd81c478eba691d6f79794557b776"} Mar 09 16:15:18 crc kubenswrapper[4831]: I0309 16:15:18.387110 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/root-account-create-update-htfm7" podStartSLOduration=2.387091855 podStartE2EDuration="2.387091855s" podCreationTimestamp="2026-03-09 16:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:15:18.384431518 +0000 UTC m=+1045.518113941" watchObservedRunningTime="2026-03-09 16:15:18.387091855 +0000 UTC m=+1045.520774278" Mar 09 16:15:19 crc kubenswrapper[4831]: I0309 16:15:19.340354 4831 generic.go:334] "Generic (PLEG): container finished" podID="65f34546-4a53-44eb-bac8-59527096a882" containerID="17acb04554823975693117326773e6242f3cd81c478eba691d6f79794557b776" exitCode=0 Mar 09 16:15:19 crc kubenswrapper[4831]: I0309 16:15:19.340551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-htfm7" event={"ID":"65f34546-4a53-44eb-bac8-59527096a882","Type":"ContainerDied","Data":"17acb04554823975693117326773e6242f3cd81c478eba691d6f79794557b776"} Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.349082 4831 generic.go:334] "Generic (PLEG): container finished" podID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerID="21873ad9673578be77347756f6a58837ce857585891e5a094f13b81266ba898d" exitCode=0 Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.349189 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" event={"ID":"8a3b8608-bf80-4c51-a661-65b0c5056ecc","Type":"ContainerDied","Data":"21873ad9673578be77347756f6a58837ce857585891e5a094f13b81266ba898d"} Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.689758 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.815209 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q2r6\" (UniqueName: \"kubernetes.io/projected/65f34546-4a53-44eb-bac8-59527096a882-kube-api-access-7q2r6\") pod \"65f34546-4a53-44eb-bac8-59527096a882\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.815708 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65f34546-4a53-44eb-bac8-59527096a882-operator-scripts\") pod \"65f34546-4a53-44eb-bac8-59527096a882\" (UID: \"65f34546-4a53-44eb-bac8-59527096a882\") " Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.816540 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f34546-4a53-44eb-bac8-59527096a882-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65f34546-4a53-44eb-bac8-59527096a882" (UID: "65f34546-4a53-44eb-bac8-59527096a882"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.821765 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f34546-4a53-44eb-bac8-59527096a882-kube-api-access-7q2r6" (OuterVolumeSpecName: "kube-api-access-7q2r6") pod "65f34546-4a53-44eb-bac8-59527096a882" (UID: "65f34546-4a53-44eb-bac8-59527096a882"). InnerVolumeSpecName "kube-api-access-7q2r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.835890 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.835945 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.887928 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.917270 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q2r6\" (UniqueName: \"kubernetes.io/projected/65f34546-4a53-44eb-bac8-59527096a882-kube-api-access-7q2r6\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:20 crc kubenswrapper[4831]: I0309 16:15:20.917312 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65f34546-4a53-44eb-bac8-59527096a882-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:21 crc kubenswrapper[4831]: I0309 16:15:21.360280 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-htfm7" event={"ID":"65f34546-4a53-44eb-bac8-59527096a882","Type":"ContainerDied","Data":"c609fb0bc1784f559185bcc6a7bfa147d7d1ca86f8af129213f7d0d01de6f31f"} Mar 09 16:15:21 crc kubenswrapper[4831]: I0309 16:15:21.360325 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c609fb0bc1784f559185bcc6a7bfa147d7d1ca86f8af129213f7d0d01de6f31f" Mar 09 16:15:21 crc kubenswrapper[4831]: I0309 16:15:21.360347 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-htfm7" Mar 09 16:15:21 crc kubenswrapper[4831]: I0309 16:15:21.362268 4831 generic.go:334] "Generic (PLEG): container finished" podID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerID="aa85674707da749a2c294d3346ca28b2d67857e5051dd0704325c566d4c1cf8b" exitCode=0 Mar 09 16:15:21 crc kubenswrapper[4831]: I0309 16:15:21.362312 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" event={"ID":"8a3b8608-bf80-4c51-a661-65b0c5056ecc","Type":"ContainerDied","Data":"aa85674707da749a2c294d3346ca28b2d67857e5051dd0704325c566d4c1cf8b"} Mar 09 16:15:21 crc kubenswrapper[4831]: I0309 16:15:21.402121 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.514520 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mgdcd"] Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.577800 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.703368 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.785135 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.944893 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nc29\" (UniqueName: \"kubernetes.io/projected/8a3b8608-bf80-4c51-a661-65b0c5056ecc-kube-api-access-2nc29\") pod \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.945161 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-bundle\") pod \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.945233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-util\") pod \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\" (UID: \"8a3b8608-bf80-4c51-a661-65b0c5056ecc\") " Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.946066 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-bundle" (OuterVolumeSpecName: "bundle") pod "8a3b8608-bf80-4c51-a661-65b0c5056ecc" (UID: "8a3b8608-bf80-4c51-a661-65b0c5056ecc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.952572 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3b8608-bf80-4c51-a661-65b0c5056ecc-kube-api-access-2nc29" (OuterVolumeSpecName: "kube-api-access-2nc29") pod "8a3b8608-bf80-4c51-a661-65b0c5056ecc" (UID: "8a3b8608-bf80-4c51-a661-65b0c5056ecc"). InnerVolumeSpecName "kube-api-access-2nc29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:15:22 crc kubenswrapper[4831]: I0309 16:15:22.955595 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-util" (OuterVolumeSpecName: "util") pod "8a3b8608-bf80-4c51-a661-65b0c5056ecc" (UID: "8a3b8608-bf80-4c51-a661-65b0c5056ecc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:15:23 crc kubenswrapper[4831]: I0309 16:15:23.046946 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nc29\" (UniqueName: \"kubernetes.io/projected/8a3b8608-bf80-4c51-a661-65b0c5056ecc-kube-api-access-2nc29\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:23 crc kubenswrapper[4831]: I0309 16:15:23.047023 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:23 crc kubenswrapper[4831]: I0309 16:15:23.047037 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a3b8608-bf80-4c51-a661-65b0c5056ecc-util\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:23 crc kubenswrapper[4831]: I0309 16:15:23.380054 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" event={"ID":"8a3b8608-bf80-4c51-a661-65b0c5056ecc","Type":"ContainerDied","Data":"7ec008f34af4246f9b71109bb2eb6d97ce1ff672fa00b6f90f9e14f3feb72839"} Mar 09 16:15:23 crc kubenswrapper[4831]: I0309 16:15:23.380102 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ec008f34af4246f9b71109bb2eb6d97ce1ff672fa00b6f90f9e14f3feb72839" Mar 09 16:15:23 crc kubenswrapper[4831]: I0309 16:15:23.380132 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg" Mar 09 16:15:23 crc kubenswrapper[4831]: I0309 16:15:23.380299 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mgdcd" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="registry-server" containerID="cri-o://b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9" gracePeriod=2 Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.309518 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.387915 4831 generic.go:334] "Generic (PLEG): container finished" podID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerID="b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9" exitCode=0 Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.387982 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mgdcd" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.388003 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgdcd" event={"ID":"fbca9e2b-e508-44a4-bc50-dbf37f511e63","Type":"ContainerDied","Data":"b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9"} Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.388579 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgdcd" event={"ID":"fbca9e2b-e508-44a4-bc50-dbf37f511e63","Type":"ContainerDied","Data":"418a92de0eb48deb4a7d4754e96bf26b27dc17a6566f907abb4ac80793090e17"} Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.388612 4831 scope.go:117] "RemoveContainer" containerID="b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.404080 4831 scope.go:117] "RemoveContainer" containerID="7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.423976 4831 scope.go:117] "RemoveContainer" containerID="fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.443664 4831 scope.go:117] "RemoveContainer" containerID="b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9" Mar 09 16:15:24 crc kubenswrapper[4831]: E0309 16:15:24.444815 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9\": container with ID starting with b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9 not found: ID does not exist" containerID="b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.444868 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9"} err="failed to get container status \"b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9\": rpc error: code = NotFound desc = could not find container \"b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9\": container with ID starting with b4e1b16c9f8ff634f91db460424c7d328154b4718893c23508a3753fe41dd0e9 not found: ID does not exist" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.444898 4831 scope.go:117] "RemoveContainer" containerID="7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb" Mar 09 16:15:24 crc kubenswrapper[4831]: E0309 16:15:24.445280 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb\": container with ID starting with 7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb not found: ID does not exist" containerID="7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.445303 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb"} err="failed to get container status \"7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb\": rpc error: code = NotFound desc = could not find container \"7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb\": container with ID starting with 7a644d2b34e64cb054cb00ee8b14f8011afbf057838b6b0f03971f0c60edc8cb not found: ID does not exist" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.445320 4831 scope.go:117] "RemoveContainer" containerID="fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07" Mar 09 16:15:24 crc kubenswrapper[4831]: E0309 16:15:24.445635 4831 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07\": container with ID starting with fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07 not found: ID does not exist" containerID="fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.445679 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07"} err="failed to get container status \"fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07\": rpc error: code = NotFound desc = could not find container \"fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07\": container with ID starting with fb374cd781aededaa98466d94d7f361c57733a26811fb95fb6c7ef31a02d3a07 not found: ID does not exist" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.458383 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.473115 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq8gx\" (UniqueName: \"kubernetes.io/projected/fbca9e2b-e508-44a4-bc50-dbf37f511e63-kube-api-access-lq8gx\") pod \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.473391 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-catalog-content\") pod \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.473537 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-utilities\") pod \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\" (UID: \"fbca9e2b-e508-44a4-bc50-dbf37f511e63\") " Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.474646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-utilities" (OuterVolumeSpecName: "utilities") pod "fbca9e2b-e508-44a4-bc50-dbf37f511e63" (UID: "fbca9e2b-e508-44a4-bc50-dbf37f511e63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.479654 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbca9e2b-e508-44a4-bc50-dbf37f511e63-kube-api-access-lq8gx" (OuterVolumeSpecName: "kube-api-access-lq8gx") pod "fbca9e2b-e508-44a4-bc50-dbf37f511e63" (UID: "fbca9e2b-e508-44a4-bc50-dbf37f511e63"). InnerVolumeSpecName "kube-api-access-lq8gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.524503 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.537180 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbca9e2b-e508-44a4-bc50-dbf37f511e63" (UID: "fbca9e2b-e508-44a4-bc50-dbf37f511e63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.575591 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq8gx\" (UniqueName: \"kubernetes.io/projected/fbca9e2b-e508-44a4-bc50-dbf37f511e63-kube-api-access-lq8gx\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.575628 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.575640 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbca9e2b-e508-44a4-bc50-dbf37f511e63-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.723024 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mgdcd"] Mar 09 16:15:24 crc kubenswrapper[4831]: I0309 16:15:24.727275 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mgdcd"] Mar 09 16:15:25 crc kubenswrapper[4831]: I0309 16:15:25.629498 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" path="/var/lib/kubelet/pods/fbca9e2b-e508-44a4-bc50-dbf37f511e63/volumes" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.974188 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2"] Mar 09 16:15:33 crc kubenswrapper[4831]: E0309 16:15:33.975473 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerName="extract" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.975510 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerName="extract" Mar 09 16:15:33 crc kubenswrapper[4831]: E0309 16:15:33.975554 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f34546-4a53-44eb-bac8-59527096a882" containerName="mariadb-account-create-update" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.975573 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f34546-4a53-44eb-bac8-59527096a882" containerName="mariadb-account-create-update" Mar 09 16:15:33 crc kubenswrapper[4831]: E0309 16:15:33.975605 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="registry-server" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.975625 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="registry-server" Mar 09 16:15:33 crc kubenswrapper[4831]: E0309 16:15:33.975652 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="extract-utilities" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.975671 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="extract-utilities" Mar 09 16:15:33 crc kubenswrapper[4831]: E0309 16:15:33.975712 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="extract-content" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.975730 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="extract-content" Mar 09 16:15:33 crc kubenswrapper[4831]: E0309 16:15:33.975755 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerName="util" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.975773 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerName="util" Mar 09 16:15:33 crc kubenswrapper[4831]: E0309 16:15:33.975791 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerName="pull" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.975810 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerName="pull" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.976117 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f34546-4a53-44eb-bac8-59527096a882" containerName="mariadb-account-create-update" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.976158 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3b8608-bf80-4c51-a661-65b0c5056ecc" containerName="extract" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.976201 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbca9e2b-e508-44a4-bc50-dbf37f511e63" containerName="registry-server" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.977149 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.984265 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-j29db" Mar 09 16:15:33 crc kubenswrapper[4831]: I0309 16:15:33.985846 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2"] Mar 09 16:15:34 crc kubenswrapper[4831]: I0309 16:15:34.104977 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7ftt\" (UniqueName: \"kubernetes.io/projected/c4a0d3e8-756a-499f-bfd7-a720f83cbd6e-kube-api-access-x7ftt\") pod \"rabbitmq-cluster-operator-779fc9694b-s5zp2\" (UID: \"c4a0d3e8-756a-499f-bfd7-a720f83cbd6e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" Mar 09 16:15:34 crc kubenswrapper[4831]: I0309 16:15:34.206099 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7ftt\" (UniqueName: \"kubernetes.io/projected/c4a0d3e8-756a-499f-bfd7-a720f83cbd6e-kube-api-access-x7ftt\") pod \"rabbitmq-cluster-operator-779fc9694b-s5zp2\" (UID: \"c4a0d3e8-756a-499f-bfd7-a720f83cbd6e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" Mar 09 16:15:34 crc kubenswrapper[4831]: I0309 16:15:34.246941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7ftt\" (UniqueName: \"kubernetes.io/projected/c4a0d3e8-756a-499f-bfd7-a720f83cbd6e-kube-api-access-x7ftt\") pod \"rabbitmq-cluster-operator-779fc9694b-s5zp2\" (UID: \"c4a0d3e8-756a-499f-bfd7-a720f83cbd6e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" Mar 09 16:15:34 crc kubenswrapper[4831]: I0309 16:15:34.299637 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" Mar 09 16:15:34 crc kubenswrapper[4831]: I0309 16:15:34.542136 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2"] Mar 09 16:15:35 crc kubenswrapper[4831]: I0309 16:15:35.471828 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" event={"ID":"c4a0d3e8-756a-499f-bfd7-a720f83cbd6e","Type":"ContainerStarted","Data":"eae92ebe5c73492c95bbea75e5a9a4bd1061c5fc7eacb87c303ba49e91d778f3"} Mar 09 16:15:38 crc kubenswrapper[4831]: I0309 16:15:38.493017 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" event={"ID":"c4a0d3e8-756a-499f-bfd7-a720f83cbd6e","Type":"ContainerStarted","Data":"8c244a8e406e58265ed4bc7d6699e048e7cf110bb07e4043f8d475aadb8719fd"} Mar 09 16:15:38 crc kubenswrapper[4831]: I0309 16:15:38.513144 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-s5zp2" podStartSLOduration=2.360335842 podStartE2EDuration="5.513128339s" podCreationTimestamp="2026-03-09 16:15:33 +0000 UTC" firstStartedPulling="2026-03-09 16:15:34.56638043 +0000 UTC m=+1061.700062853" lastFinishedPulling="2026-03-09 16:15:37.719172927 +0000 UTC m=+1064.852855350" observedRunningTime="2026-03-09 16:15:38.508468215 +0000 UTC m=+1065.642150668" watchObservedRunningTime="2026-03-09 16:15:38.513128339 +0000 UTC m=+1065.646810762" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.099323 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zkj2p"] Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.101026 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.108821 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkj2p"] Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.228981 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9dt2\" (UniqueName: \"kubernetes.io/projected/947c9e6b-705d-48a9-913f-c431b61a124d-kube-api-access-n9dt2\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.229352 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-catalog-content\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.229373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-utilities\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.330451 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-catalog-content\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.330504 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-utilities\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.330560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9dt2\" (UniqueName: \"kubernetes.io/projected/947c9e6b-705d-48a9-913f-c431b61a124d-kube-api-access-n9dt2\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.331259 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-utilities\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.331377 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-catalog-content\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.351584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9dt2\" (UniqueName: \"kubernetes.io/projected/947c9e6b-705d-48a9-913f-c431b61a124d-kube-api-access-n9dt2\") pod \"redhat-operators-zkj2p\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.420679 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:42 crc kubenswrapper[4831]: I0309 16:15:42.673964 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkj2p"] Mar 09 16:15:42 crc kubenswrapper[4831]: W0309 16:15:42.676811 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod947c9e6b_705d_48a9_913f_c431b61a124d.slice/crio-91f625ae87010fe2e7ef573c403b6b5a279ddeb44efd58beb662d5b0b2e39fe8 WatchSource:0}: Error finding container 91f625ae87010fe2e7ef573c403b6b5a279ddeb44efd58beb662d5b0b2e39fe8: Status 404 returned error can't find the container with id 91f625ae87010fe2e7ef573c403b6b5a279ddeb44efd58beb662d5b0b2e39fe8 Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.400203 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.401752 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.404176 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-erlang-cookie" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.404377 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-default-user" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.404541 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-server-conf" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.404740 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-plugins-conf" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.405583 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-server-dockercfg-7t4nb" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.424151 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.528695 4831 generic.go:334] "Generic (PLEG): container finished" podID="947c9e6b-705d-48a9-913f-c431b61a124d" containerID="cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53" exitCode=0 Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.528737 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkj2p" event={"ID":"947c9e6b-705d-48a9-913f-c431b61a124d","Type":"ContainerDied","Data":"cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53"} Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.528762 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkj2p" 
event={"ID":"947c9e6b-705d-48a9-913f-c431b61a124d","Type":"ContainerStarted","Data":"91f625ae87010fe2e7ef573c403b6b5a279ddeb44efd58beb662d5b0b2e39fe8"} Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d24e185f-9a88-42b9-867e-2814f11c820e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545156 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545231 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545462 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d24e185f-9a88-42b9-867e-2814f11c820e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545617 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkz6n\" (UniqueName: \"kubernetes.io/projected/d24e185f-9a88-42b9-867e-2814f11c820e-kube-api-access-zkz6n\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.545838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d24e185f-9a88-42b9-867e-2814f11c820e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647231 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647255 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d24e185f-9a88-42b9-867e-2814f11c820e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647537 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkz6n\" (UniqueName: \"kubernetes.io/projected/d24e185f-9a88-42b9-867e-2814f11c820e-kube-api-access-zkz6n\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d24e185f-9a88-42b9-867e-2814f11c820e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647619 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d24e185f-9a88-42b9-867e-2814f11c820e-plugins-conf\") pod \"rabbitmq-server-0\" 
(UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.647641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.648110 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.649852 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.653916 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.653966 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e638ac475a7cd5697ee0f215bd3a893ddf2c1f3e228232de12e64494144cf855/globalmount\"" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.655015 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d24e185f-9a88-42b9-867e-2814f11c820e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.660452 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d24e185f-9a88-42b9-867e-2814f11c820e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.662846 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d24e185f-9a88-42b9-867e-2814f11c820e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.665994 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d24e185f-9a88-42b9-867e-2814f11c820e-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.677036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkz6n\" (UniqueName: \"kubernetes.io/projected/d24e185f-9a88-42b9-867e-2814f11c820e-kube-api-access-zkz6n\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.699185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2082f0c8-552b-4580-a7a4-40d0ba766446\") pod \"rabbitmq-server-0\" (UID: \"d24e185f-9a88-42b9-867e-2814f11c820e\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:43 crc kubenswrapper[4831]: I0309 16:15:43.721729 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:15:44 crc kubenswrapper[4831]: I0309 16:15:44.154828 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 09 16:15:44 crc kubenswrapper[4831]: W0309 16:15:44.156460 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24e185f_9a88_42b9_867e_2814f11c820e.slice/crio-306eb219b3637d2694342529c848fc00d781fc293e39e2e09188cd1c0be287bb WatchSource:0}: Error finding container 306eb219b3637d2694342529c848fc00d781fc293e39e2e09188cd1c0be287bb: Status 404 returned error can't find the container with id 306eb219b3637d2694342529c848fc00d781fc293e39e2e09188cd1c0be287bb Mar 09 16:15:44 crc kubenswrapper[4831]: I0309 16:15:44.536490 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" 
event={"ID":"d24e185f-9a88-42b9-867e-2814f11c820e","Type":"ContainerStarted","Data":"306eb219b3637d2694342529c848fc00d781fc293e39e2e09188cd1c0be287bb"} Mar 09 16:15:44 crc kubenswrapper[4831]: I0309 16:15:44.538124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkj2p" event={"ID":"947c9e6b-705d-48a9-913f-c431b61a124d","Type":"ContainerStarted","Data":"5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5"} Mar 09 16:15:45 crc kubenswrapper[4831]: I0309 16:15:45.548721 4831 generic.go:334] "Generic (PLEG): container finished" podID="947c9e6b-705d-48a9-913f-c431b61a124d" containerID="5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5" exitCode=0 Mar 09 16:15:45 crc kubenswrapper[4831]: I0309 16:15:45.548788 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkj2p" event={"ID":"947c9e6b-705d-48a9-913f-c431b61a124d","Type":"ContainerDied","Data":"5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5"} Mar 09 16:15:45 crc kubenswrapper[4831]: I0309 16:15:45.895974 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-w8zv2"] Mar 09 16:15:45 crc kubenswrapper[4831]: I0309 16:15:45.898848 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-w8zv2" Mar 09 16:15:45 crc kubenswrapper[4831]: I0309 16:15:45.901365 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-d4jqq" Mar 09 16:15:45 crc kubenswrapper[4831]: I0309 16:15:45.903482 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-w8zv2"] Mar 09 16:15:45 crc kubenswrapper[4831]: I0309 16:15:45.989050 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhcnf\" (UniqueName: \"kubernetes.io/projected/bc1e53e2-055a-46d3-9914-bde6e36ee280-kube-api-access-fhcnf\") pod \"keystone-operator-index-w8zv2\" (UID: \"bc1e53e2-055a-46d3-9914-bde6e36ee280\") " pod="openstack-operators/keystone-operator-index-w8zv2" Mar 09 16:15:46 crc kubenswrapper[4831]: I0309 16:15:46.091122 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhcnf\" (UniqueName: \"kubernetes.io/projected/bc1e53e2-055a-46d3-9914-bde6e36ee280-kube-api-access-fhcnf\") pod \"keystone-operator-index-w8zv2\" (UID: \"bc1e53e2-055a-46d3-9914-bde6e36ee280\") " pod="openstack-operators/keystone-operator-index-w8zv2" Mar 09 16:15:46 crc kubenswrapper[4831]: I0309 16:15:46.111460 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhcnf\" (UniqueName: \"kubernetes.io/projected/bc1e53e2-055a-46d3-9914-bde6e36ee280-kube-api-access-fhcnf\") pod \"keystone-operator-index-w8zv2\" (UID: \"bc1e53e2-055a-46d3-9914-bde6e36ee280\") " pod="openstack-operators/keystone-operator-index-w8zv2" Mar 09 16:15:46 crc kubenswrapper[4831]: I0309 16:15:46.220491 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-w8zv2" Mar 09 16:15:47 crc kubenswrapper[4831]: I0309 16:15:47.098079 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-w8zv2"] Mar 09 16:15:47 crc kubenswrapper[4831]: W0309 16:15:47.105800 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1e53e2_055a_46d3_9914_bde6e36ee280.slice/crio-06cce0bdacc017e61e195682d7e0436edc437360d666983b2381bbe4dd830512 WatchSource:0}: Error finding container 06cce0bdacc017e61e195682d7e0436edc437360d666983b2381bbe4dd830512: Status 404 returned error can't find the container with id 06cce0bdacc017e61e195682d7e0436edc437360d666983b2381bbe4dd830512 Mar 09 16:15:47 crc kubenswrapper[4831]: I0309 16:15:47.568845 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkj2p" event={"ID":"947c9e6b-705d-48a9-913f-c431b61a124d","Type":"ContainerStarted","Data":"50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b"} Mar 09 16:15:47 crc kubenswrapper[4831]: I0309 16:15:47.570896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-w8zv2" event={"ID":"bc1e53e2-055a-46d3-9914-bde6e36ee280","Type":"ContainerStarted","Data":"06cce0bdacc017e61e195682d7e0436edc437360d666983b2381bbe4dd830512"} Mar 09 16:15:47 crc kubenswrapper[4831]: I0309 16:15:47.592138 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zkj2p" podStartSLOduration=2.385435119 podStartE2EDuration="5.592117545s" podCreationTimestamp="2026-03-09 16:15:42 +0000 UTC" firstStartedPulling="2026-03-09 16:15:43.53000916 +0000 UTC m=+1070.663691583" lastFinishedPulling="2026-03-09 16:15:46.736691586 +0000 UTC m=+1073.870374009" observedRunningTime="2026-03-09 16:15:47.585696991 +0000 UTC m=+1074.719379434" 
watchObservedRunningTime="2026-03-09 16:15:47.592117545 +0000 UTC m=+1074.725799968" Mar 09 16:15:50 crc kubenswrapper[4831]: I0309 16:15:50.593070 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-w8zv2" event={"ID":"bc1e53e2-055a-46d3-9914-bde6e36ee280","Type":"ContainerStarted","Data":"5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965"} Mar 09 16:15:50 crc kubenswrapper[4831]: I0309 16:15:50.615628 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-w8zv2" podStartSLOduration=2.524492142 podStartE2EDuration="5.615605347s" podCreationTimestamp="2026-03-09 16:15:45 +0000 UTC" firstStartedPulling="2026-03-09 16:15:47.108332099 +0000 UTC m=+1074.242014522" lastFinishedPulling="2026-03-09 16:15:50.199445304 +0000 UTC m=+1077.333127727" observedRunningTime="2026-03-09 16:15:50.609040398 +0000 UTC m=+1077.742722831" watchObservedRunningTime="2026-03-09 16:15:50.615605347 +0000 UTC m=+1077.749287800" Mar 09 16:15:51 crc kubenswrapper[4831]: I0309 16:15:51.602305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"d24e185f-9a88-42b9-867e-2814f11c820e","Type":"ContainerStarted","Data":"7b90866109d2cbafc0acbee2c100ea09b87c553ad8b4904db9499a18c0f3ac17"} Mar 09 16:15:51 crc kubenswrapper[4831]: I0309 16:15:51.891305 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-w8zv2"] Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.421385 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.421453 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.609548 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-w8zv2" podUID="bc1e53e2-055a-46d3-9914-bde6e36ee280" containerName="registry-server" containerID="cri-o://5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965" gracePeriod=2 Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.697321 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-sf6xv"] Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.698138 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.705981 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-sf6xv"] Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.794235 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gprcx\" (UniqueName: \"kubernetes.io/projected/70894f60-1ec0-4a31-a25d-3acc3e358869-kube-api-access-gprcx\") pod \"keystone-operator-index-sf6xv\" (UID: \"70894f60-1ec0-4a31-a25d-3acc3e358869\") " pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.895637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gprcx\" (UniqueName: \"kubernetes.io/projected/70894f60-1ec0-4a31-a25d-3acc3e358869-kube-api-access-gprcx\") pod \"keystone-operator-index-sf6xv\" (UID: \"70894f60-1ec0-4a31-a25d-3acc3e358869\") " pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:15:52 crc kubenswrapper[4831]: I0309 16:15:52.925515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gprcx\" (UniqueName: \"kubernetes.io/projected/70894f60-1ec0-4a31-a25d-3acc3e358869-kube-api-access-gprcx\") pod \"keystone-operator-index-sf6xv\" 
(UID: \"70894f60-1ec0-4a31-a25d-3acc3e358869\") " pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.047923 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-w8zv2" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.070830 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.201771 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhcnf\" (UniqueName: \"kubernetes.io/projected/bc1e53e2-055a-46d3-9914-bde6e36ee280-kube-api-access-fhcnf\") pod \"bc1e53e2-055a-46d3-9914-bde6e36ee280\" (UID: \"bc1e53e2-055a-46d3-9914-bde6e36ee280\") " Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.207980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1e53e2-055a-46d3-9914-bde6e36ee280-kube-api-access-fhcnf" (OuterVolumeSpecName: "kube-api-access-fhcnf") pod "bc1e53e2-055a-46d3-9914-bde6e36ee280" (UID: "bc1e53e2-055a-46d3-9914-bde6e36ee280"). InnerVolumeSpecName "kube-api-access-fhcnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.303923 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhcnf\" (UniqueName: \"kubernetes.io/projected/bc1e53e2-055a-46d3-9914-bde6e36ee280-kube-api-access-fhcnf\") on node \"crc\" DevicePath \"\"" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.459954 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zkj2p" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="registry-server" probeResult="failure" output=< Mar 09 16:15:53 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Mar 09 16:15:53 crc kubenswrapper[4831]: > Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.472957 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-sf6xv"] Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.622135 4831 generic.go:334] "Generic (PLEG): container finished" podID="bc1e53e2-055a-46d3-9914-bde6e36ee280" containerID="5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965" exitCode=0 Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.624806 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-w8zv2" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.625945 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-sf6xv" event={"ID":"70894f60-1ec0-4a31-a25d-3acc3e358869","Type":"ContainerStarted","Data":"040f8db937a82bd4cf04425ef78388fe40224b1db5ff5cb423c0f0fc488cac89"} Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.625998 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-w8zv2" event={"ID":"bc1e53e2-055a-46d3-9914-bde6e36ee280","Type":"ContainerDied","Data":"5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965"} Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.626032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-w8zv2" event={"ID":"bc1e53e2-055a-46d3-9914-bde6e36ee280","Type":"ContainerDied","Data":"06cce0bdacc017e61e195682d7e0436edc437360d666983b2381bbe4dd830512"} Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.626061 4831 scope.go:117] "RemoveContainer" containerID="5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.654570 4831 scope.go:117] "RemoveContainer" containerID="5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965" Mar 09 16:15:53 crc kubenswrapper[4831]: E0309 16:15:53.658725 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965\": container with ID starting with 5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965 not found: ID does not exist" containerID="5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.658782 4831 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965"} err="failed to get container status \"5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965\": rpc error: code = NotFound desc = could not find container \"5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965\": container with ID starting with 5a8a25f44017c896b31ee4119fd4c366867450d6830dbeda6f5819df035b6965 not found: ID does not exist" Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.668661 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-w8zv2"] Mar 09 16:15:53 crc kubenswrapper[4831]: I0309 16:15:53.673543 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-w8zv2"] Mar 09 16:15:54 crc kubenswrapper[4831]: I0309 16:15:54.646114 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-sf6xv" event={"ID":"70894f60-1ec0-4a31-a25d-3acc3e358869","Type":"ContainerStarted","Data":"d3a7e2e1d8d7b8243d4e06181938d1f4c6974a1a57b7b2dd787d7f073766fed7"} Mar 09 16:15:54 crc kubenswrapper[4831]: I0309 16:15:54.681071 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-sf6xv" podStartSLOduration=2.229126988 podStartE2EDuration="2.681044568s" podCreationTimestamp="2026-03-09 16:15:52 +0000 UTC" firstStartedPulling="2026-03-09 16:15:53.48992018 +0000 UTC m=+1080.623602603" lastFinishedPulling="2026-03-09 16:15:53.94183773 +0000 UTC m=+1081.075520183" observedRunningTime="2026-03-09 16:15:54.674055797 +0000 UTC m=+1081.807738300" watchObservedRunningTime="2026-03-09 16:15:54.681044568 +0000 UTC m=+1081.814727021" Mar 09 16:15:55 crc kubenswrapper[4831]: I0309 16:15:55.625201 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1e53e2-055a-46d3-9914-bde6e36ee280" 
path="/var/lib/kubelet/pods/bc1e53e2-055a-46d3-9914-bde6e36ee280/volumes" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.133007 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551216-p5vkf"] Mar 09 16:16:00 crc kubenswrapper[4831]: E0309 16:16:00.133809 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1e53e2-055a-46d3-9914-bde6e36ee280" containerName="registry-server" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.133823 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1e53e2-055a-46d3-9914-bde6e36ee280" containerName="registry-server" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.133960 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1e53e2-055a-46d3-9914-bde6e36ee280" containerName="registry-server" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.134453 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.138528 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.138541 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.138745 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.152743 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551216-p5vkf"] Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.220292 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm22p\" (UniqueName: 
\"kubernetes.io/projected/16d0bed7-6225-44f3-b82b-afcbf1ea42dc-kube-api-access-qm22p\") pod \"auto-csr-approver-29551216-p5vkf\" (UID: \"16d0bed7-6225-44f3-b82b-afcbf1ea42dc\") " pod="openshift-infra/auto-csr-approver-29551216-p5vkf" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.322467 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm22p\" (UniqueName: \"kubernetes.io/projected/16d0bed7-6225-44f3-b82b-afcbf1ea42dc-kube-api-access-qm22p\") pod \"auto-csr-approver-29551216-p5vkf\" (UID: \"16d0bed7-6225-44f3-b82b-afcbf1ea42dc\") " pod="openshift-infra/auto-csr-approver-29551216-p5vkf" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.349450 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm22p\" (UniqueName: \"kubernetes.io/projected/16d0bed7-6225-44f3-b82b-afcbf1ea42dc-kube-api-access-qm22p\") pod \"auto-csr-approver-29551216-p5vkf\" (UID: \"16d0bed7-6225-44f3-b82b-afcbf1ea42dc\") " pod="openshift-infra/auto-csr-approver-29551216-p5vkf" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.467679 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" Mar 09 16:16:00 crc kubenswrapper[4831]: I0309 16:16:00.919789 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551216-p5vkf"] Mar 09 16:16:01 crc kubenswrapper[4831]: I0309 16:16:01.707156 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" event={"ID":"16d0bed7-6225-44f3-b82b-afcbf1ea42dc","Type":"ContainerStarted","Data":"434bf835002d782926ee8288248cb47548e09036dee96e8c19ab92911fd4e631"} Mar 09 16:16:02 crc kubenswrapper[4831]: I0309 16:16:02.476097 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:16:02 crc kubenswrapper[4831]: I0309 16:16:02.518423 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:16:02 crc kubenswrapper[4831]: I0309 16:16:02.713516 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" event={"ID":"16d0bed7-6225-44f3-b82b-afcbf1ea42dc","Type":"ContainerStarted","Data":"21d0fb805a97bed1eaa12abf2728c7e45f29ecaaa5bbe03c3c2635f2690d267e"} Mar 09 16:16:02 crc kubenswrapper[4831]: I0309 16:16:02.729346 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" podStartSLOduration=1.4207652720000001 podStartE2EDuration="2.729325797s" podCreationTimestamp="2026-03-09 16:16:00 +0000 UTC" firstStartedPulling="2026-03-09 16:16:00.955255171 +0000 UTC m=+1088.088937594" lastFinishedPulling="2026-03-09 16:16:02.263815666 +0000 UTC m=+1089.397498119" observedRunningTime="2026-03-09 16:16:02.72734389 +0000 UTC m=+1089.861026323" watchObservedRunningTime="2026-03-09 16:16:02.729325797 +0000 UTC m=+1089.863008220" Mar 09 16:16:03 crc kubenswrapper[4831]: I0309 16:16:03.071879 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:16:03 crc kubenswrapper[4831]: I0309 16:16:03.071947 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:16:03 crc kubenswrapper[4831]: I0309 16:16:03.108576 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:16:03 crc kubenswrapper[4831]: I0309 16:16:03.723910 4831 generic.go:334] "Generic (PLEG): container finished" podID="16d0bed7-6225-44f3-b82b-afcbf1ea42dc" containerID="21d0fb805a97bed1eaa12abf2728c7e45f29ecaaa5bbe03c3c2635f2690d267e" exitCode=0 Mar 09 16:16:03 crc kubenswrapper[4831]: I0309 16:16:03.723954 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" event={"ID":"16d0bed7-6225-44f3-b82b-afcbf1ea42dc","Type":"ContainerDied","Data":"21d0fb805a97bed1eaa12abf2728c7e45f29ecaaa5bbe03c3c2635f2690d267e"} Mar 09 16:16:03 crc kubenswrapper[4831]: I0309 16:16:03.757369 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-sf6xv" Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.072224 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.197144 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm22p\" (UniqueName: \"kubernetes.io/projected/16d0bed7-6225-44f3-b82b-afcbf1ea42dc-kube-api-access-qm22p\") pod \"16d0bed7-6225-44f3-b82b-afcbf1ea42dc\" (UID: \"16d0bed7-6225-44f3-b82b-afcbf1ea42dc\") " Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.205084 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d0bed7-6225-44f3-b82b-afcbf1ea42dc-kube-api-access-qm22p" (OuterVolumeSpecName: "kube-api-access-qm22p") pod "16d0bed7-6225-44f3-b82b-afcbf1ea42dc" (UID: "16d0bed7-6225-44f3-b82b-afcbf1ea42dc"). InnerVolumeSpecName "kube-api-access-qm22p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.299449 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm22p\" (UniqueName: \"kubernetes.io/projected/16d0bed7-6225-44f3-b82b-afcbf1ea42dc-kube-api-access-qm22p\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.741346 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" event={"ID":"16d0bed7-6225-44f3-b82b-afcbf1ea42dc","Type":"ContainerDied","Data":"434bf835002d782926ee8288248cb47548e09036dee96e8c19ab92911fd4e631"} Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.741703 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="434bf835002d782926ee8288248cb47548e09036dee96e8c19ab92911fd4e631" Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.741418 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551216-p5vkf" Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.776172 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551210-rtbsr"] Mar 09 16:16:05 crc kubenswrapper[4831]: I0309 16:16:05.779972 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551210-rtbsr"] Mar 09 16:16:06 crc kubenswrapper[4831]: I0309 16:16:06.692553 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkj2p"] Mar 09 16:16:06 crc kubenswrapper[4831]: I0309 16:16:06.692983 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zkj2p" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="registry-server" containerID="cri-o://50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b" gracePeriod=2 Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.086485 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.236820 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-catalog-content\") pod \"947c9e6b-705d-48a9-913f-c431b61a124d\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.236960 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-utilities\") pod \"947c9e6b-705d-48a9-913f-c431b61a124d\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.237030 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9dt2\" (UniqueName: \"kubernetes.io/projected/947c9e6b-705d-48a9-913f-c431b61a124d-kube-api-access-n9dt2\") pod \"947c9e6b-705d-48a9-913f-c431b61a124d\" (UID: \"947c9e6b-705d-48a9-913f-c431b61a124d\") " Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.237657 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-utilities" (OuterVolumeSpecName: "utilities") pod "947c9e6b-705d-48a9-913f-c431b61a124d" (UID: "947c9e6b-705d-48a9-913f-c431b61a124d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.243033 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947c9e6b-705d-48a9-913f-c431b61a124d-kube-api-access-n9dt2" (OuterVolumeSpecName: "kube-api-access-n9dt2") pod "947c9e6b-705d-48a9-913f-c431b61a124d" (UID: "947c9e6b-705d-48a9-913f-c431b61a124d"). InnerVolumeSpecName "kube-api-access-n9dt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.339231 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.339277 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9dt2\" (UniqueName: \"kubernetes.io/projected/947c9e6b-705d-48a9-913f-c431b61a124d-kube-api-access-n9dt2\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.362234 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "947c9e6b-705d-48a9-913f-c431b61a124d" (UID: "947c9e6b-705d-48a9-913f-c431b61a124d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.440239 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947c9e6b-705d-48a9-913f-c431b61a124d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.625275 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3436d06-d059-4710-9ab2-97360867646b" path="/var/lib/kubelet/pods/e3436d06-d059-4710-9ab2-97360867646b/volumes" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.755627 4831 generic.go:334] "Generic (PLEG): container finished" podID="947c9e6b-705d-48a9-913f-c431b61a124d" containerID="50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b" exitCode=0 Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.755673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkj2p" 
event={"ID":"947c9e6b-705d-48a9-913f-c431b61a124d","Type":"ContainerDied","Data":"50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b"} Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.755704 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkj2p" event={"ID":"947c9e6b-705d-48a9-913f-c431b61a124d","Type":"ContainerDied","Data":"91f625ae87010fe2e7ef573c403b6b5a279ddeb44efd58beb662d5b0b2e39fe8"} Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.755724 4831 scope.go:117] "RemoveContainer" containerID="50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.755865 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkj2p" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.778253 4831 scope.go:117] "RemoveContainer" containerID="5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.795561 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkj2p"] Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.812679 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zkj2p"] Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.813987 4831 scope.go:117] "RemoveContainer" containerID="cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.849830 4831 scope.go:117] "RemoveContainer" containerID="50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b" Mar 09 16:16:07 crc kubenswrapper[4831]: E0309 16:16:07.850605 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b\": container with ID 
starting with 50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b not found: ID does not exist" containerID="50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.850663 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b"} err="failed to get container status \"50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b\": rpc error: code = NotFound desc = could not find container \"50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b\": container with ID starting with 50fd0311c549d5454ef2088ca15a10788859ffe6afd9fe783e325070671fde5b not found: ID does not exist" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.850696 4831 scope.go:117] "RemoveContainer" containerID="5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5" Mar 09 16:16:07 crc kubenswrapper[4831]: E0309 16:16:07.851063 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5\": container with ID starting with 5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5 not found: ID does not exist" containerID="5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.851093 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5"} err="failed to get container status \"5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5\": rpc error: code = NotFound desc = could not find container \"5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5\": container with ID starting with 5245eb61aad83c3f2c4f7897d4f40df996eb17bd1a44997a7f05270a6fda43f5 not found: 
ID does not exist" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.851113 4831 scope.go:117] "RemoveContainer" containerID="cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53" Mar 09 16:16:07 crc kubenswrapper[4831]: E0309 16:16:07.851420 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53\": container with ID starting with cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53 not found: ID does not exist" containerID="cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53" Mar 09 16:16:07 crc kubenswrapper[4831]: I0309 16:16:07.851440 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53"} err="failed to get container status \"cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53\": rpc error: code = NotFound desc = could not find container \"cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53\": container with ID starting with cd8d406e71d50ff5605942e9fb26d82c841b31dd5eb94678e9ea82afd918cd53 not found: ID does not exist" Mar 09 16:16:09 crc kubenswrapper[4831]: I0309 16:16:09.633143 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" path="/var/lib/kubelet/pods/947c9e6b-705d-48a9-913f-c431b61a124d/volumes" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.366631 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg"] Mar 09 16:16:18 crc kubenswrapper[4831]: E0309 16:16:18.367729 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="extract-content" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.367750 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="extract-content" Mar 09 16:16:18 crc kubenswrapper[4831]: E0309 16:16:18.367768 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="registry-server" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.367779 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="registry-server" Mar 09 16:16:18 crc kubenswrapper[4831]: E0309 16:16:18.367803 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d0bed7-6225-44f3-b82b-afcbf1ea42dc" containerName="oc" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.367814 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d0bed7-6225-44f3-b82b-afcbf1ea42dc" containerName="oc" Mar 09 16:16:18 crc kubenswrapper[4831]: E0309 16:16:18.367836 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="extract-utilities" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.367846 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="extract-utilities" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.368022 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="947c9e6b-705d-48a9-913f-c431b61a124d" containerName="registry-server" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.368050 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d0bed7-6225-44f3-b82b-afcbf1ea42dc" containerName="oc" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.369762 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.376523 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djwsq" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.384306 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg"] Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.502020 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.502116 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvgg\" (UniqueName: \"kubernetes.io/projected/5349223e-9c5e-4621-b8c0-d7ee2e192d46-kube-api-access-4cvgg\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.502179 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 
16:16:18.603806 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.603948 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.603996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvgg\" (UniqueName: \"kubernetes.io/projected/5349223e-9c5e-4621-b8c0-d7ee2e192d46-kube-api-access-4cvgg\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.604515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.604960 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.628376 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvgg\" (UniqueName: \"kubernetes.io/projected/5349223e-9c5e-4621-b8c0-d7ee2e192d46-kube-api-access-4cvgg\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:18 crc kubenswrapper[4831]: I0309 16:16:18.693705 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:19 crc kubenswrapper[4831]: I0309 16:16:19.113338 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg"] Mar 09 16:16:19 crc kubenswrapper[4831]: I0309 16:16:19.860887 4831 generic.go:334] "Generic (PLEG): container finished" podID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerID="2298dc5d0e7a0af710c5ea13bbaa21ebc48ddb4c7b8b702eab3e8038567c5045" exitCode=0 Mar 09 16:16:19 crc kubenswrapper[4831]: I0309 16:16:19.860943 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" event={"ID":"5349223e-9c5e-4621-b8c0-d7ee2e192d46","Type":"ContainerDied","Data":"2298dc5d0e7a0af710c5ea13bbaa21ebc48ddb4c7b8b702eab3e8038567c5045"} Mar 09 16:16:19 crc kubenswrapper[4831]: I0309 16:16:19.860973 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" event={"ID":"5349223e-9c5e-4621-b8c0-d7ee2e192d46","Type":"ContainerStarted","Data":"357969caae86ffd07aafa0c9e4144d23702b3eb7eba2ccb2a667039f8171a727"} Mar 09 16:16:20 crc kubenswrapper[4831]: I0309 16:16:20.870659 4831 generic.go:334] "Generic (PLEG): container finished" podID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerID="1a31b92b93cb1564561d1a2b0e8c1dfe1bcd3a594236c9391f7a01113f29061d" exitCode=0 Mar 09 16:16:20 crc kubenswrapper[4831]: I0309 16:16:20.870785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" event={"ID":"5349223e-9c5e-4621-b8c0-d7ee2e192d46","Type":"ContainerDied","Data":"1a31b92b93cb1564561d1a2b0e8c1dfe1bcd3a594236c9391f7a01113f29061d"} Mar 09 16:16:21 crc kubenswrapper[4831]: I0309 16:16:21.879675 4831 generic.go:334] "Generic (PLEG): container finished" podID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerID="7c0dc6cd50794fdd00954f947135ff9ba73b77492a6574fb279563cc919a175b" exitCode=0 Mar 09 16:16:21 crc kubenswrapper[4831]: I0309 16:16:21.879801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" event={"ID":"5349223e-9c5e-4621-b8c0-d7ee2e192d46","Type":"ContainerDied","Data":"7c0dc6cd50794fdd00954f947135ff9ba73b77492a6574fb279563cc919a175b"} Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.202357 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.371038 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-util\") pod \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.371215 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cvgg\" (UniqueName: \"kubernetes.io/projected/5349223e-9c5e-4621-b8c0-d7ee2e192d46-kube-api-access-4cvgg\") pod \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.371244 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-bundle\") pod \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\" (UID: \"5349223e-9c5e-4621-b8c0-d7ee2e192d46\") " Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.372233 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-bundle" (OuterVolumeSpecName: "bundle") pod "5349223e-9c5e-4621-b8c0-d7ee2e192d46" (UID: "5349223e-9c5e-4621-b8c0-d7ee2e192d46"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.378659 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5349223e-9c5e-4621-b8c0-d7ee2e192d46-kube-api-access-4cvgg" (OuterVolumeSpecName: "kube-api-access-4cvgg") pod "5349223e-9c5e-4621-b8c0-d7ee2e192d46" (UID: "5349223e-9c5e-4621-b8c0-d7ee2e192d46"). InnerVolumeSpecName "kube-api-access-4cvgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.384617 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-util" (OuterVolumeSpecName: "util") pod "5349223e-9c5e-4621-b8c0-d7ee2e192d46" (UID: "5349223e-9c5e-4621-b8c0-d7ee2e192d46"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.473165 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cvgg\" (UniqueName: \"kubernetes.io/projected/5349223e-9c5e-4621-b8c0-d7ee2e192d46-kube-api-access-4cvgg\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.473208 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.473220 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5349223e-9c5e-4621-b8c0-d7ee2e192d46-util\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.892843 4831 generic.go:334] "Generic (PLEG): container finished" podID="d24e185f-9a88-42b9-867e-2814f11c820e" containerID="7b90866109d2cbafc0acbee2c100ea09b87c553ad8b4904db9499a18c0f3ac17" exitCode=0 Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.892910 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"d24e185f-9a88-42b9-867e-2814f11c820e","Type":"ContainerDied","Data":"7b90866109d2cbafc0acbee2c100ea09b87c553ad8b4904db9499a18c0f3ac17"} Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.896893 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" event={"ID":"5349223e-9c5e-4621-b8c0-d7ee2e192d46","Type":"ContainerDied","Data":"357969caae86ffd07aafa0c9e4144d23702b3eb7eba2ccb2a667039f8171a727"} Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.897183 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="357969caae86ffd07aafa0c9e4144d23702b3eb7eba2ccb2a667039f8171a727" Mar 09 16:16:23 crc kubenswrapper[4831]: I0309 16:16:23.896982 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg" Mar 09 16:16:24 crc kubenswrapper[4831]: I0309 16:16:24.903574 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"d24e185f-9a88-42b9-867e-2814f11c820e","Type":"ContainerStarted","Data":"22ac8d4e5fe5798891b1fd1567bc093b74564328dbf9fc30edff0ee48004cf9c"} Mar 09 16:16:24 crc kubenswrapper[4831]: I0309 16:16:24.903835 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:16:24 crc kubenswrapper[4831]: I0309 16:16:24.929725 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.932018716 podStartE2EDuration="42.92969993s" podCreationTimestamp="2026-03-09 16:15:42 +0000 UTC" firstStartedPulling="2026-03-09 16:15:44.160106032 +0000 UTC m=+1071.293788485" lastFinishedPulling="2026-03-09 16:15:50.157787276 +0000 UTC m=+1077.291469699" observedRunningTime="2026-03-09 16:16:24.924224673 +0000 UTC m=+1112.057907106" watchObservedRunningTime="2026-03-09 16:16:24.92969993 +0000 UTC m=+1112.063382363" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.684972 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv"] 
Mar 09 16:16:34 crc kubenswrapper[4831]: E0309 16:16:34.685694 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerName="pull" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.685705 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerName="pull" Mar 09 16:16:34 crc kubenswrapper[4831]: E0309 16:16:34.685727 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerName="util" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.685733 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerName="util" Mar 09 16:16:34 crc kubenswrapper[4831]: E0309 16:16:34.685742 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerName="extract" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.685747 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerName="extract" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.685852 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5349223e-9c5e-4621-b8c0-d7ee2e192d46" containerName="extract" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.686260 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.688174 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.688354 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-h26rv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.704068 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv"] Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.725878 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f7e6dce0-2482-4e60-88f0-d53f94df4a68-webhook-cert\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.725927 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gn2t\" (UniqueName: \"kubernetes.io/projected/f7e6dce0-2482-4e60-88f0-d53f94df4a68-kube-api-access-2gn2t\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.726202 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f7e6dce0-2482-4e60-88f0-d53f94df4a68-apiservice-cert\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" 
(UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.828012 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f7e6dce0-2482-4e60-88f0-d53f94df4a68-webhook-cert\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.828100 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gn2t\" (UniqueName: \"kubernetes.io/projected/f7e6dce0-2482-4e60-88f0-d53f94df4a68-kube-api-access-2gn2t\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.828275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f7e6dce0-2482-4e60-88f0-d53f94df4a68-apiservice-cert\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.835847 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f7e6dce0-2482-4e60-88f0-d53f94df4a68-apiservice-cert\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.836834 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f7e6dce0-2482-4e60-88f0-d53f94df4a68-webhook-cert\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:34 crc kubenswrapper[4831]: I0309 16:16:34.849580 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gn2t\" (UniqueName: \"kubernetes.io/projected/f7e6dce0-2482-4e60-88f0-d53f94df4a68-kube-api-access-2gn2t\") pod \"keystone-operator-controller-manager-78fb4b8689-9qprv\" (UID: \"f7e6dce0-2482-4e60-88f0-d53f94df4a68\") " pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:35 crc kubenswrapper[4831]: I0309 16:16:35.005194 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:35 crc kubenswrapper[4831]: I0309 16:16:35.450793 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv"] Mar 09 16:16:35 crc kubenswrapper[4831]: I0309 16:16:35.982337 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" event={"ID":"f7e6dce0-2482-4e60-88f0-d53f94df4a68","Type":"ContainerStarted","Data":"5a7cbbc9adf68819e1fbebc4acea09a9f95dc7afa8a67dae732b122acba02bdb"} Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.108668 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-949hm"] Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.110064 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.123833 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-949hm"] Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.257373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-utilities\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.257499 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-catalog-content\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.257657 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5h9\" (UniqueName: \"kubernetes.io/projected/c466dab8-0c65-4596-9151-725dff336993-kube-api-access-6r5h9\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.358994 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5h9\" (UniqueName: \"kubernetes.io/projected/c466dab8-0c65-4596-9151-725dff336993-kube-api-access-6r5h9\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.359087 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-utilities\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.359152 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-catalog-content\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.359618 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-utilities\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.359643 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-catalog-content\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.379148 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5h9\" (UniqueName: \"kubernetes.io/projected/c466dab8-0c65-4596-9151-725dff336993-kube-api-access-6r5h9\") pod \"certified-operators-949hm\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.429135 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:36 crc kubenswrapper[4831]: I0309 16:16:36.966182 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-949hm"] Mar 09 16:16:36 crc kubenswrapper[4831]: W0309 16:16:36.973165 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc466dab8_0c65_4596_9151_725dff336993.slice/crio-5d051046edeba3d036e9355fbcbbc3f8d3e9bae7a2b5f0ac2a090838b6b8dc4a WatchSource:0}: Error finding container 5d051046edeba3d036e9355fbcbbc3f8d3e9bae7a2b5f0ac2a090838b6b8dc4a: Status 404 returned error can't find the container with id 5d051046edeba3d036e9355fbcbbc3f8d3e9bae7a2b5f0ac2a090838b6b8dc4a Mar 09 16:16:37 crc kubenswrapper[4831]: I0309 16:16:37.016212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-949hm" event={"ID":"c466dab8-0c65-4596-9151-725dff336993","Type":"ContainerStarted","Data":"5d051046edeba3d036e9355fbcbbc3f8d3e9bae7a2b5f0ac2a090838b6b8dc4a"} Mar 09 16:16:38 crc kubenswrapper[4831]: I0309 16:16:38.022377 4831 generic.go:334] "Generic (PLEG): container finished" podID="c466dab8-0c65-4596-9151-725dff336993" containerID="4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605" exitCode=0 Mar 09 16:16:38 crc kubenswrapper[4831]: I0309 16:16:38.022519 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-949hm" event={"ID":"c466dab8-0c65-4596-9151-725dff336993","Type":"ContainerDied","Data":"4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605"} Mar 09 16:16:40 crc kubenswrapper[4831]: I0309 16:16:40.059078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" 
event={"ID":"f7e6dce0-2482-4e60-88f0-d53f94df4a68","Type":"ContainerStarted","Data":"6ba2437f278950c36906edb0f3b5c6f90a5c181d85d76debd863af0477815871"} Mar 09 16:16:40 crc kubenswrapper[4831]: I0309 16:16:40.059429 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:40 crc kubenswrapper[4831]: I0309 16:16:40.080724 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" podStartSLOduration=2.033107668 podStartE2EDuration="6.080706453s" podCreationTimestamp="2026-03-09 16:16:34 +0000 UTC" firstStartedPulling="2026-03-09 16:16:35.460836254 +0000 UTC m=+1122.594518677" lastFinishedPulling="2026-03-09 16:16:39.508435039 +0000 UTC m=+1126.642117462" observedRunningTime="2026-03-09 16:16:40.076568744 +0000 UTC m=+1127.210251157" watchObservedRunningTime="2026-03-09 16:16:40.080706453 +0000 UTC m=+1127.214388876" Mar 09 16:16:41 crc kubenswrapper[4831]: I0309 16:16:41.067386 4831 generic.go:334] "Generic (PLEG): container finished" podID="c466dab8-0c65-4596-9151-725dff336993" containerID="a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30" exitCode=0 Mar 09 16:16:41 crc kubenswrapper[4831]: I0309 16:16:41.067427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-949hm" event={"ID":"c466dab8-0c65-4596-9151-725dff336993","Type":"ContainerDied","Data":"a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30"} Mar 09 16:16:42 crc kubenswrapper[4831]: I0309 16:16:42.087551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-949hm" event={"ID":"c466dab8-0c65-4596-9151-725dff336993","Type":"ContainerStarted","Data":"d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5"} Mar 09 16:16:42 crc kubenswrapper[4831]: I0309 16:16:42.116855 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-949hm" podStartSLOduration=3.109281094 podStartE2EDuration="6.116827523s" podCreationTimestamp="2026-03-09 16:16:36 +0000 UTC" firstStartedPulling="2026-03-09 16:16:38.857174721 +0000 UTC m=+1125.990857144" lastFinishedPulling="2026-03-09 16:16:41.86472115 +0000 UTC m=+1128.998403573" observedRunningTime="2026-03-09 16:16:42.106030063 +0000 UTC m=+1129.239712496" watchObservedRunningTime="2026-03-09 16:16:42.116827523 +0000 UTC m=+1129.250509986" Mar 09 16:16:43 crc kubenswrapper[4831]: I0309 16:16:43.724663 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 16:16:45 crc kubenswrapper[4831]: I0309 16:16:45.010150 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-78fb4b8689-9qprv" Mar 09 16:16:46 crc kubenswrapper[4831]: I0309 16:16:46.429864 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:46 crc kubenswrapper[4831]: I0309 16:16:46.430691 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:46 crc kubenswrapper[4831]: I0309 16:16:46.481300 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:47 crc kubenswrapper[4831]: I0309 16:16:47.164735 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.105819 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-949hm"] Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.420834 4831 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["swift-kuttl-tests/keystone-db-create-ftmbs"] Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.421860 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.425975 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5"] Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.426889 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.431122 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-db-secret" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.437777 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-ftmbs"] Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.442611 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5"] Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.549819 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6ps\" (UniqueName: \"kubernetes.io/projected/248c872a-0515-49c5-a99d-5af2c4295932-kube-api-access-pp6ps\") pod \"keystone-db-create-ftmbs\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.550195 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed68c1aa-2141-456d-8a99-e56de0d609e7-operator-scripts\") pod \"keystone-efbe-account-create-update-mdbq5\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " 
pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.550324 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tpw\" (UniqueName: \"kubernetes.io/projected/ed68c1aa-2141-456d-8a99-e56de0d609e7-kube-api-access-k5tpw\") pod \"keystone-efbe-account-create-update-mdbq5\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.550426 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248c872a-0515-49c5-a99d-5af2c4295932-operator-scripts\") pod \"keystone-db-create-ftmbs\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.651573 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6ps\" (UniqueName: \"kubernetes.io/projected/248c872a-0515-49c5-a99d-5af2c4295932-kube-api-access-pp6ps\") pod \"keystone-db-create-ftmbs\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.651633 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed68c1aa-2141-456d-8a99-e56de0d609e7-operator-scripts\") pod \"keystone-efbe-account-create-update-mdbq5\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.651692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tpw\" (UniqueName: 
\"kubernetes.io/projected/ed68c1aa-2141-456d-8a99-e56de0d609e7-kube-api-access-k5tpw\") pod \"keystone-efbe-account-create-update-mdbq5\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.651708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248c872a-0515-49c5-a99d-5af2c4295932-operator-scripts\") pod \"keystone-db-create-ftmbs\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.652373 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248c872a-0515-49c5-a99d-5af2c4295932-operator-scripts\") pod \"keystone-db-create-ftmbs\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.652444 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed68c1aa-2141-456d-8a99-e56de0d609e7-operator-scripts\") pod \"keystone-efbe-account-create-update-mdbq5\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.677358 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6ps\" (UniqueName: \"kubernetes.io/projected/248c872a-0515-49c5-a99d-5af2c4295932-kube-api-access-pp6ps\") pod \"keystone-db-create-ftmbs\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.693337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k5tpw\" (UniqueName: \"kubernetes.io/projected/ed68c1aa-2141-456d-8a99-e56de0d609e7-kube-api-access-k5tpw\") pod \"keystone-efbe-account-create-update-mdbq5\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.743489 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:48 crc kubenswrapper[4831]: I0309 16:16:48.755277 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.129348 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-949hm" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="registry-server" containerID="cri-o://d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5" gracePeriod=2 Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.208648 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5"] Mar 09 16:16:49 crc kubenswrapper[4831]: W0309 16:16:49.234037 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded68c1aa_2141_456d_8a99_e56de0d609e7.slice/crio-c307fe8bdc127ffa5274269f1caa5c7d21effefc25f0ccc8717f938644d50280 WatchSource:0}: Error finding container c307fe8bdc127ffa5274269f1caa5c7d21effefc25f0ccc8717f938644d50280: Status 404 returned error can't find the container with id c307fe8bdc127ffa5274269f1caa5c7d21effefc25f0ccc8717f938644d50280 Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.271267 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-ftmbs"] Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 
16:16:49.594281 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.665481 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-catalog-content\") pod \"c466dab8-0c65-4596-9151-725dff336993\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.665569 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r5h9\" (UniqueName: \"kubernetes.io/projected/c466dab8-0c65-4596-9151-725dff336993-kube-api-access-6r5h9\") pod \"c466dab8-0c65-4596-9151-725dff336993\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.665601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-utilities\") pod \"c466dab8-0c65-4596-9151-725dff336993\" (UID: \"c466dab8-0c65-4596-9151-725dff336993\") " Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.666502 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-utilities" (OuterVolumeSpecName: "utilities") pod "c466dab8-0c65-4596-9151-725dff336993" (UID: "c466dab8-0c65-4596-9151-725dff336993"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.687936 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c466dab8-0c65-4596-9151-725dff336993-kube-api-access-6r5h9" (OuterVolumeSpecName: "kube-api-access-6r5h9") pod "c466dab8-0c65-4596-9151-725dff336993" (UID: "c466dab8-0c65-4596-9151-725dff336993"). InnerVolumeSpecName "kube-api-access-6r5h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.724785 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c466dab8-0c65-4596-9151-725dff336993" (UID: "c466dab8-0c65-4596-9151-725dff336993"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.767709 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r5h9\" (UniqueName: \"kubernetes.io/projected/c466dab8-0c65-4596-9151-725dff336993-kube-api-access-6r5h9\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.767744 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:49 crc kubenswrapper[4831]: I0309 16:16:49.767754 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c466dab8-0c65-4596-9151-725dff336993-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.148142 4831 generic.go:334] "Generic (PLEG): container finished" podID="c466dab8-0c65-4596-9151-725dff336993" 
containerID="d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5" exitCode=0 Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.148276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-949hm" event={"ID":"c466dab8-0c65-4596-9151-725dff336993","Type":"ContainerDied","Data":"d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5"} Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.148345 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-949hm" event={"ID":"c466dab8-0c65-4596-9151-725dff336993","Type":"ContainerDied","Data":"5d051046edeba3d036e9355fbcbbc3f8d3e9bae7a2b5f0ac2a090838b6b8dc4a"} Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.148374 4831 scope.go:117] "RemoveContainer" containerID="d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.148365 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-949hm" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.150763 4831 generic.go:334] "Generic (PLEG): container finished" podID="ed68c1aa-2141-456d-8a99-e56de0d609e7" containerID="6048b24367d4cdff9e3164e9eefef623cb19bdb41c379a939cbcdbf6ee0d3540" exitCode=0 Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.150880 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" event={"ID":"ed68c1aa-2141-456d-8a99-e56de0d609e7","Type":"ContainerDied","Data":"6048b24367d4cdff9e3164e9eefef623cb19bdb41c379a939cbcdbf6ee0d3540"} Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.150934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" event={"ID":"ed68c1aa-2141-456d-8a99-e56de0d609e7","Type":"ContainerStarted","Data":"c307fe8bdc127ffa5274269f1caa5c7d21effefc25f0ccc8717f938644d50280"} Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.153925 4831 generic.go:334] "Generic (PLEG): container finished" podID="248c872a-0515-49c5-a99d-5af2c4295932" containerID="9d1c6a6ef65bada3cf3d8c41d64c3ac7182a13def4fd8f9eb884c788d29088a8" exitCode=0 Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.153977 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-ftmbs" event={"ID":"248c872a-0515-49c5-a99d-5af2c4295932","Type":"ContainerDied","Data":"9d1c6a6ef65bada3cf3d8c41d64c3ac7182a13def4fd8f9eb884c788d29088a8"} Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.154009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-ftmbs" event={"ID":"248c872a-0515-49c5-a99d-5af2c4295932","Type":"ContainerStarted","Data":"bd5774cc263690f75641f6c989de0bca676ff7e3359c3ec24ab3df0dcc704975"} Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.189760 4831 scope.go:117] "RemoveContainer" 
containerID="a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.217330 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-949hm"] Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.231472 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-949hm"] Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.235300 4831 scope.go:117] "RemoveContainer" containerID="4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.263014 4831 scope.go:117] "RemoveContainer" containerID="d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5" Mar 09 16:16:50 crc kubenswrapper[4831]: E0309 16:16:50.263741 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5\": container with ID starting with d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5 not found: ID does not exist" containerID="d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.263838 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5"} err="failed to get container status \"d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5\": rpc error: code = NotFound desc = could not find container \"d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5\": container with ID starting with d17d2bc0fd3f16e74d2ead48933cb41efb50e573ba9b64a266027327e0e7a4f5 not found: ID does not exist" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.263915 4831 scope.go:117] "RemoveContainer" 
containerID="a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30" Mar 09 16:16:50 crc kubenswrapper[4831]: E0309 16:16:50.270857 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30\": container with ID starting with a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30 not found: ID does not exist" containerID="a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.270890 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30"} err="failed to get container status \"a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30\": rpc error: code = NotFound desc = could not find container \"a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30\": container with ID starting with a65db331e0781ce0532edc79e45733112503ea433437b8660fe786e208ecaf30 not found: ID does not exist" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.270911 4831 scope.go:117] "RemoveContainer" containerID="4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605" Mar 09 16:16:50 crc kubenswrapper[4831]: E0309 16:16:50.271229 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605\": container with ID starting with 4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605 not found: ID does not exist" containerID="4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605" Mar 09 16:16:50 crc kubenswrapper[4831]: I0309 16:16:50.271249 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605"} err="failed to get container status \"4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605\": rpc error: code = NotFound desc = could not find container \"4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605\": container with ID starting with 4a0c84d8e28262186d62f506663092c686b3aae677d7706085066dd4dd76f605 not found: ID does not exist" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.484848 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.488630 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.600877 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp6ps\" (UniqueName: \"kubernetes.io/projected/248c872a-0515-49c5-a99d-5af2c4295932-kube-api-access-pp6ps\") pod \"248c872a-0515-49c5-a99d-5af2c4295932\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.601004 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5tpw\" (UniqueName: \"kubernetes.io/projected/ed68c1aa-2141-456d-8a99-e56de0d609e7-kube-api-access-k5tpw\") pod \"ed68c1aa-2141-456d-8a99-e56de0d609e7\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.601052 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed68c1aa-2141-456d-8a99-e56de0d609e7-operator-scripts\") pod \"ed68c1aa-2141-456d-8a99-e56de0d609e7\" (UID: \"ed68c1aa-2141-456d-8a99-e56de0d609e7\") " Mar 09 16:16:51 crc 
kubenswrapper[4831]: I0309 16:16:51.601139 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248c872a-0515-49c5-a99d-5af2c4295932-operator-scripts\") pod \"248c872a-0515-49c5-a99d-5af2c4295932\" (UID: \"248c872a-0515-49c5-a99d-5af2c4295932\") " Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.602194 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/248c872a-0515-49c5-a99d-5af2c4295932-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "248c872a-0515-49c5-a99d-5af2c4295932" (UID: "248c872a-0515-49c5-a99d-5af2c4295932"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.602367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed68c1aa-2141-456d-8a99-e56de0d609e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed68c1aa-2141-456d-8a99-e56de0d609e7" (UID: "ed68c1aa-2141-456d-8a99-e56de0d609e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.608684 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed68c1aa-2141-456d-8a99-e56de0d609e7-kube-api-access-k5tpw" (OuterVolumeSpecName: "kube-api-access-k5tpw") pod "ed68c1aa-2141-456d-8a99-e56de0d609e7" (UID: "ed68c1aa-2141-456d-8a99-e56de0d609e7"). InnerVolumeSpecName "kube-api-access-k5tpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.609619 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248c872a-0515-49c5-a99d-5af2c4295932-kube-api-access-pp6ps" (OuterVolumeSpecName: "kube-api-access-pp6ps") pod "248c872a-0515-49c5-a99d-5af2c4295932" (UID: "248c872a-0515-49c5-a99d-5af2c4295932"). InnerVolumeSpecName "kube-api-access-pp6ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.636187 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c466dab8-0c65-4596-9151-725dff336993" path="/var/lib/kubelet/pods/c466dab8-0c65-4596-9151-725dff336993/volumes" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.703841 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5tpw\" (UniqueName: \"kubernetes.io/projected/ed68c1aa-2141-456d-8a99-e56de0d609e7-kube-api-access-k5tpw\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.703879 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed68c1aa-2141-456d-8a99-e56de0d609e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.703890 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248c872a-0515-49c5-a99d-5af2c4295932-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:51 crc kubenswrapper[4831]: I0309 16:16:51.703900 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp6ps\" (UniqueName: \"kubernetes.io/projected/248c872a-0515-49c5-a99d-5af2c4295932-kube-api-access-pp6ps\") on node \"crc\" DevicePath \"\"" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.169247 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" event={"ID":"ed68c1aa-2141-456d-8a99-e56de0d609e7","Type":"ContainerDied","Data":"c307fe8bdc127ffa5274269f1caa5c7d21effefc25f0ccc8717f938644d50280"} Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.169320 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c307fe8bdc127ffa5274269f1caa5c7d21effefc25f0ccc8717f938644d50280" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.169337 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.171897 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-ftmbs" event={"ID":"248c872a-0515-49c5-a99d-5af2c4295932","Type":"ContainerDied","Data":"bd5774cc263690f75641f6c989de0bca676ff7e3359c3ec24ab3df0dcc704975"} Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.171941 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5774cc263690f75641f6c989de0bca676ff7e3359c3ec24ab3df0dcc704975" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.172002 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-ftmbs" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.305107 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-wdjpz"] Mar 09 16:16:52 crc kubenswrapper[4831]: E0309 16:16:52.305938 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed68c1aa-2141-456d-8a99-e56de0d609e7" containerName="mariadb-account-create-update" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.305974 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed68c1aa-2141-456d-8a99-e56de0d609e7" containerName="mariadb-account-create-update" Mar 09 16:16:52 crc kubenswrapper[4831]: E0309 16:16:52.305993 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="extract-content" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.306008 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="extract-content" Mar 09 16:16:52 crc kubenswrapper[4831]: E0309 16:16:52.306030 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="registry-server" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.306043 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="registry-server" Mar 09 16:16:52 crc kubenswrapper[4831]: E0309 16:16:52.306078 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="248c872a-0515-49c5-a99d-5af2c4295932" containerName="mariadb-database-create" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.306094 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="248c872a-0515-49c5-a99d-5af2c4295932" containerName="mariadb-database-create" Mar 09 16:16:52 crc kubenswrapper[4831]: E0309 16:16:52.306117 4831 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="extract-utilities" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.306129 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="extract-utilities" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.306347 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="248c872a-0515-49c5-a99d-5af2c4295932" containerName="mariadb-database-create" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.306371 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed68c1aa-2141-456d-8a99-e56de0d609e7" containerName="mariadb-account-create-update" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.306422 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c466dab8-0c65-4596-9151-725dff336993" containerName="registry-server" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.307100 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-wdjpz" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.310187 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-hzzk2" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.313272 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-wdjpz"] Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.413900 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8j2t\" (UniqueName: \"kubernetes.io/projected/a134931b-ea1a-4219-8a40-2fa54beee370-kube-api-access-f8j2t\") pod \"barbican-operator-index-wdjpz\" (UID: \"a134931b-ea1a-4219-8a40-2fa54beee370\") " pod="openstack-operators/barbican-operator-index-wdjpz" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.516120 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8j2t\" (UniqueName: \"kubernetes.io/projected/a134931b-ea1a-4219-8a40-2fa54beee370-kube-api-access-f8j2t\") pod \"barbican-operator-index-wdjpz\" (UID: \"a134931b-ea1a-4219-8a40-2fa54beee370\") " pod="openstack-operators/barbican-operator-index-wdjpz" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.537383 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8j2t\" (UniqueName: \"kubernetes.io/projected/a134931b-ea1a-4219-8a40-2fa54beee370-kube-api-access-f8j2t\") pod \"barbican-operator-index-wdjpz\" (UID: \"a134931b-ea1a-4219-8a40-2fa54beee370\") " pod="openstack-operators/barbican-operator-index-wdjpz" Mar 09 16:16:52 crc kubenswrapper[4831]: I0309 16:16:52.621486 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-wdjpz" Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.118248 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-wdjpz"] Mar 09 16:16:53 crc kubenswrapper[4831]: W0309 16:16:53.138205 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda134931b_ea1a_4219_8a40_2fa54beee370.slice/crio-294a5212efc31aaee0e4743f9b1b2760e3f70c615fab0ca18fefca51aaae7eff WatchSource:0}: Error finding container 294a5212efc31aaee0e4743f9b1b2760e3f70c615fab0ca18fefca51aaae7eff: Status 404 returned error can't find the container with id 294a5212efc31aaee0e4743f9b1b2760e3f70c615fab0ca18fefca51aaae7eff Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.178266 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-wdjpz" event={"ID":"a134931b-ea1a-4219-8a40-2fa54beee370","Type":"ContainerStarted","Data":"294a5212efc31aaee0e4743f9b1b2760e3f70c615fab0ca18fefca51aaae7eff"} Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.973379 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-db-sync-qf6r7"] Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.974754 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.977586 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-7bqjn" Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.979581 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.979753 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.979778 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 09 16:16:53 crc kubenswrapper[4831]: I0309 16:16:53.990958 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-qf6r7"] Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.038685 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncl9d\" (UniqueName: \"kubernetes.io/projected/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-kube-api-access-ncl9d\") pod \"keystone-db-sync-qf6r7\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.038745 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-config-data\") pod \"keystone-db-sync-qf6r7\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.139743 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncl9d\" (UniqueName: 
\"kubernetes.io/projected/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-kube-api-access-ncl9d\") pod \"keystone-db-sync-qf6r7\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.139800 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-config-data\") pod \"keystone-db-sync-qf6r7\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.146965 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-config-data\") pod \"keystone-db-sync-qf6r7\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.162791 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncl9d\" (UniqueName: \"kubernetes.io/projected/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-kube-api-access-ncl9d\") pod \"keystone-db-sync-qf6r7\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.194009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-wdjpz" event={"ID":"a134931b-ea1a-4219-8a40-2fa54beee370","Type":"ContainerStarted","Data":"72d3dc4011c723b2bb63662ee5e33c768abc574da93f62494b74f17b77f106e5"} Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.217786 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-wdjpz" podStartSLOduration=1.354588519 podStartE2EDuration="2.217767687s" podCreationTimestamp="2026-03-09 16:16:52 +0000 UTC" 
firstStartedPulling="2026-03-09 16:16:53.146212419 +0000 UTC m=+1140.279894872" lastFinishedPulling="2026-03-09 16:16:54.009391607 +0000 UTC m=+1141.143074040" observedRunningTime="2026-03-09 16:16:54.215762489 +0000 UTC m=+1141.349444912" watchObservedRunningTime="2026-03-09 16:16:54.217767687 +0000 UTC m=+1141.351450120" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.301686 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:16:54 crc kubenswrapper[4831]: I0309 16:16:54.500292 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-qf6r7"] Mar 09 16:16:54 crc kubenswrapper[4831]: W0309 16:16:54.505359 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f55c227_3feb_4f45_a5fc_bb9adf1f5b4d.slice/crio-713e04ed6190e5788fb99595669736260520973c8179a13823df9108181b8adf WatchSource:0}: Error finding container 713e04ed6190e5788fb99595669736260520973c8179a13823df9108181b8adf: Status 404 returned error can't find the container with id 713e04ed6190e5788fb99595669736260520973c8179a13823df9108181b8adf Mar 09 16:16:55 crc kubenswrapper[4831]: I0309 16:16:55.205499 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" event={"ID":"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d","Type":"ContainerStarted","Data":"713e04ed6190e5788fb99595669736260520973c8179a13823df9108181b8adf"} Mar 09 16:16:57 crc kubenswrapper[4831]: I0309 16:16:57.494656 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-wdjpz"] Mar 09 16:16:57 crc kubenswrapper[4831]: I0309 16:16:57.495217 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-wdjpz" podUID="a134931b-ea1a-4219-8a40-2fa54beee370" containerName="registry-server" 
containerID="cri-o://72d3dc4011c723b2bb63662ee5e33c768abc574da93f62494b74f17b77f106e5" gracePeriod=2 Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.098961 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-6qh4f"] Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.099815 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.127775 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-6qh4f"] Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.229156 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcv2h\" (UniqueName: \"kubernetes.io/projected/16d21189-91cf-4f36-b6f3-96a240bd6167-kube-api-access-rcv2h\") pod \"barbican-operator-index-6qh4f\" (UID: \"16d21189-91cf-4f36-b6f3-96a240bd6167\") " pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.229369 4831 generic.go:334] "Generic (PLEG): container finished" podID="a134931b-ea1a-4219-8a40-2fa54beee370" containerID="72d3dc4011c723b2bb63662ee5e33c768abc574da93f62494b74f17b77f106e5" exitCode=0 Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.229415 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-wdjpz" event={"ID":"a134931b-ea1a-4219-8a40-2fa54beee370","Type":"ContainerDied","Data":"72d3dc4011c723b2bb63662ee5e33c768abc574da93f62494b74f17b77f106e5"} Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.320719 4831 scope.go:117] "RemoveContainer" containerID="b33de5dbb8383c74b2f28df0a6747886dd4da73f5ecf531181710d39a6f3fb46" Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.330131 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rcv2h\" (UniqueName: \"kubernetes.io/projected/16d21189-91cf-4f36-b6f3-96a240bd6167-kube-api-access-rcv2h\") pod \"barbican-operator-index-6qh4f\" (UID: \"16d21189-91cf-4f36-b6f3-96a240bd6167\") " pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.353100 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcv2h\" (UniqueName: \"kubernetes.io/projected/16d21189-91cf-4f36-b6f3-96a240bd6167-kube-api-access-rcv2h\") pod \"barbican-operator-index-6qh4f\" (UID: \"16d21189-91cf-4f36-b6f3-96a240bd6167\") " pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:16:58 crc kubenswrapper[4831]: I0309 16:16:58.421551 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.160004 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-wdjpz" Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.267724 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" event={"ID":"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d","Type":"ContainerStarted","Data":"106fd2db5a102ff27824b3bd79bdbb7681b1fbcbcdf970115375e28974c3867c"} Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.270924 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8j2t\" (UniqueName: \"kubernetes.io/projected/a134931b-ea1a-4219-8a40-2fa54beee370-kube-api-access-f8j2t\") pod \"a134931b-ea1a-4219-8a40-2fa54beee370\" (UID: \"a134931b-ea1a-4219-8a40-2fa54beee370\") " Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.276897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a134931b-ea1a-4219-8a40-2fa54beee370-kube-api-access-f8j2t" 
(OuterVolumeSpecName: "kube-api-access-f8j2t") pod "a134931b-ea1a-4219-8a40-2fa54beee370" (UID: "a134931b-ea1a-4219-8a40-2fa54beee370"). InnerVolumeSpecName "kube-api-access-f8j2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.277846 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-wdjpz" event={"ID":"a134931b-ea1a-4219-8a40-2fa54beee370","Type":"ContainerDied","Data":"294a5212efc31aaee0e4743f9b1b2760e3f70c615fab0ca18fefca51aaae7eff"} Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.277891 4831 scope.go:117] "RemoveContainer" containerID="72d3dc4011c723b2bb63662ee5e33c768abc574da93f62494b74f17b77f106e5" Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.278004 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-wdjpz" Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.296999 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" podStartSLOduration=1.711296206 podStartE2EDuration="8.296984992s" podCreationTimestamp="2026-03-09 16:16:53 +0000 UTC" firstStartedPulling="2026-03-09 16:16:54.510597246 +0000 UTC m=+1141.644279669" lastFinishedPulling="2026-03-09 16:17:01.096286002 +0000 UTC m=+1148.229968455" observedRunningTime="2026-03-09 16:17:01.29305471 +0000 UTC m=+1148.426737133" watchObservedRunningTime="2026-03-09 16:17:01.296984992 +0000 UTC m=+1148.430667415" Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.319569 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-wdjpz"] Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.323588 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-wdjpz"] Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.373182 4831 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8j2t\" (UniqueName: \"kubernetes.io/projected/a134931b-ea1a-4219-8a40-2fa54beee370-kube-api-access-f8j2t\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:01 crc kubenswrapper[4831]: W0309 16:17:01.480460 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16d21189_91cf_4f36_b6f3_96a240bd6167.slice/crio-7526fd4921934eabea8d5f741967ba6505c8324913b8b6319ad5e5cb7db73063 WatchSource:0}: Error finding container 7526fd4921934eabea8d5f741967ba6505c8324913b8b6319ad5e5cb7db73063: Status 404 returned error can't find the container with id 7526fd4921934eabea8d5f741967ba6505c8324913b8b6319ad5e5cb7db73063 Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.482002 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-6qh4f"] Mar 09 16:17:01 crc kubenswrapper[4831]: I0309 16:17:01.625465 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a134931b-ea1a-4219-8a40-2fa54beee370" path="/var/lib/kubelet/pods/a134931b-ea1a-4219-8a40-2fa54beee370/volumes" Mar 09 16:17:02 crc kubenswrapper[4831]: I0309 16:17:02.287148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-6qh4f" event={"ID":"16d21189-91cf-4f36-b6f3-96a240bd6167","Type":"ContainerStarted","Data":"42275d1de3b189c20bfb4ad7928d709f323dd04d413c271d1ddea6a24204810f"} Mar 09 16:17:02 crc kubenswrapper[4831]: I0309 16:17:02.287587 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-6qh4f" event={"ID":"16d21189-91cf-4f36-b6f3-96a240bd6167","Type":"ContainerStarted","Data":"7526fd4921934eabea8d5f741967ba6505c8324913b8b6319ad5e5cb7db73063"} Mar 09 16:17:02 crc kubenswrapper[4831]: I0309 16:17:02.318349 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-index-6qh4f" podStartSLOduration=3.7777690059999998 podStartE2EDuration="4.318325292s" podCreationTimestamp="2026-03-09 16:16:58 +0000 UTC" firstStartedPulling="2026-03-09 16:17:01.484310029 +0000 UTC m=+1148.617992452" lastFinishedPulling="2026-03-09 16:17:02.024866315 +0000 UTC m=+1149.158548738" observedRunningTime="2026-03-09 16:17:02.30917946 +0000 UTC m=+1149.442861893" watchObservedRunningTime="2026-03-09 16:17:02.318325292 +0000 UTC m=+1149.452007735" Mar 09 16:17:04 crc kubenswrapper[4831]: I0309 16:17:04.305541 4831 generic.go:334] "Generic (PLEG): container finished" podID="8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d" containerID="106fd2db5a102ff27824b3bd79bdbb7681b1fbcbcdf970115375e28974c3867c" exitCode=0 Mar 09 16:17:04 crc kubenswrapper[4831]: I0309 16:17:04.305620 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" event={"ID":"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d","Type":"ContainerDied","Data":"106fd2db5a102ff27824b3bd79bdbb7681b1fbcbcdf970115375e28974c3867c"} Mar 09 16:17:05 crc kubenswrapper[4831]: I0309 16:17:05.750753 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:17:05 crc kubenswrapper[4831]: I0309 16:17:05.843584 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-config-data\") pod \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " Mar 09 16:17:05 crc kubenswrapper[4831]: I0309 16:17:05.843709 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncl9d\" (UniqueName: \"kubernetes.io/projected/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-kube-api-access-ncl9d\") pod \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\" (UID: \"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d\") " Mar 09 16:17:05 crc kubenswrapper[4831]: I0309 16:17:05.848919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-kube-api-access-ncl9d" (OuterVolumeSpecName: "kube-api-access-ncl9d") pod "8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d" (UID: "8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d"). InnerVolumeSpecName "kube-api-access-ncl9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:17:05 crc kubenswrapper[4831]: I0309 16:17:05.884291 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-config-data" (OuterVolumeSpecName: "config-data") pod "8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d" (UID: "8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:17:05 crc kubenswrapper[4831]: I0309 16:17:05.945120 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:05 crc kubenswrapper[4831]: I0309 16:17:05.945347 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncl9d\" (UniqueName: \"kubernetes.io/projected/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d-kube-api-access-ncl9d\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.325022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" event={"ID":"8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d","Type":"ContainerDied","Data":"713e04ed6190e5788fb99595669736260520973c8179a13823df9108181b8adf"} Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.325089 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="713e04ed6190e5788fb99595669736260520973c8179a13823df9108181b8adf" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.325117 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-qf6r7" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.536512 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-fkx4m"] Mar 09 16:17:06 crc kubenswrapper[4831]: E0309 16:17:06.537558 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d" containerName="keystone-db-sync" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.537585 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d" containerName="keystone-db-sync" Mar 09 16:17:06 crc kubenswrapper[4831]: E0309 16:17:06.537623 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a134931b-ea1a-4219-8a40-2fa54beee370" containerName="registry-server" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.537632 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a134931b-ea1a-4219-8a40-2fa54beee370" containerName="registry-server" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.537787 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a134931b-ea1a-4219-8a40-2fa54beee370" containerName="registry-server" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.537804 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d" containerName="keystone-db-sync" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.538380 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.540772 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.541150 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"osp-secret" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.542725 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.542900 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.543062 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-7bqjn" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.545657 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-fkx4m"] Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.655979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-credential-keys\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.656051 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-fernet-keys\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.656128 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrjj6\" (UniqueName: \"kubernetes.io/projected/3673146f-01ef-44a4-b277-1332dd810a9d-kube-api-access-zrjj6\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.656158 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-config-data\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.656183 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-scripts\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.757645 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-credential-keys\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.757701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-fernet-keys\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.757787 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zrjj6\" (UniqueName: \"kubernetes.io/projected/3673146f-01ef-44a4-b277-1332dd810a9d-kube-api-access-zrjj6\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.757816 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-config-data\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.757844 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-scripts\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.767992 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-credential-keys\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.768334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-scripts\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.768691 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-fernet-keys\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.768797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-config-data\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.790154 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrjj6\" (UniqueName: \"kubernetes.io/projected/3673146f-01ef-44a4-b277-1332dd810a9d-kube-api-access-zrjj6\") pod \"keystone-bootstrap-fkx4m\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:06 crc kubenswrapper[4831]: I0309 16:17:06.858359 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:07 crc kubenswrapper[4831]: I0309 16:17:07.343308 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-fkx4m"] Mar 09 16:17:08 crc kubenswrapper[4831]: I0309 16:17:08.344853 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" event={"ID":"3673146f-01ef-44a4-b277-1332dd810a9d","Type":"ContainerStarted","Data":"4d1151560c5239f77ef053ccc623a43ac93aafab37c00ca946c459cb485439d9"} Mar 09 16:17:08 crc kubenswrapper[4831]: I0309 16:17:08.345272 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" event={"ID":"3673146f-01ef-44a4-b277-1332dd810a9d","Type":"ContainerStarted","Data":"98a417e3c6b0284d85b9ce415364219d3c579c313a98f4e8a0139280af2e8c35"} Mar 09 16:17:08 crc kubenswrapper[4831]: I0309 16:17:08.388004 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" podStartSLOduration=2.387990275 podStartE2EDuration="2.387990275s" podCreationTimestamp="2026-03-09 16:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:17:08.383682671 +0000 UTC m=+1155.517365094" watchObservedRunningTime="2026-03-09 16:17:08.387990275 +0000 UTC m=+1155.521672698" Mar 09 16:17:08 crc kubenswrapper[4831]: I0309 16:17:08.422412 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:17:08 crc kubenswrapper[4831]: I0309 16:17:08.422461 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:17:08 crc kubenswrapper[4831]: I0309 16:17:08.477358 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:17:09 crc kubenswrapper[4831]: I0309 16:17:09.382149 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-6qh4f" Mar 09 16:17:10 crc kubenswrapper[4831]: I0309 16:17:10.363442 4831 generic.go:334] "Generic (PLEG): container finished" podID="3673146f-01ef-44a4-b277-1332dd810a9d" containerID="4d1151560c5239f77ef053ccc623a43ac93aafab37c00ca946c459cb485439d9" exitCode=0 Mar 09 16:17:10 crc kubenswrapper[4831]: I0309 16:17:10.363547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" event={"ID":"3673146f-01ef-44a4-b277-1332dd810a9d","Type":"ContainerDied","Data":"4d1151560c5239f77ef053ccc623a43ac93aafab37c00ca946c459cb485439d9"} Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.700172 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.828806 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-fernet-keys\") pod \"3673146f-01ef-44a4-b277-1332dd810a9d\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.828880 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrjj6\" (UniqueName: \"kubernetes.io/projected/3673146f-01ef-44a4-b277-1332dd810a9d-kube-api-access-zrjj6\") pod \"3673146f-01ef-44a4-b277-1332dd810a9d\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.828999 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-config-data\") pod 
\"3673146f-01ef-44a4-b277-1332dd810a9d\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.829062 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-credential-keys\") pod \"3673146f-01ef-44a4-b277-1332dd810a9d\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.829099 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-scripts\") pod \"3673146f-01ef-44a4-b277-1332dd810a9d\" (UID: \"3673146f-01ef-44a4-b277-1332dd810a9d\") " Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.834114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3673146f-01ef-44a4-b277-1332dd810a9d-kube-api-access-zrjj6" (OuterVolumeSpecName: "kube-api-access-zrjj6") pod "3673146f-01ef-44a4-b277-1332dd810a9d" (UID: "3673146f-01ef-44a4-b277-1332dd810a9d"). InnerVolumeSpecName "kube-api-access-zrjj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.834508 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3673146f-01ef-44a4-b277-1332dd810a9d" (UID: "3673146f-01ef-44a4-b277-1332dd810a9d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.834608 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3673146f-01ef-44a4-b277-1332dd810a9d" (UID: "3673146f-01ef-44a4-b277-1332dd810a9d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.843589 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-scripts" (OuterVolumeSpecName: "scripts") pod "3673146f-01ef-44a4-b277-1332dd810a9d" (UID: "3673146f-01ef-44a4-b277-1332dd810a9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.846623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-config-data" (OuterVolumeSpecName: "config-data") pod "3673146f-01ef-44a4-b277-1332dd810a9d" (UID: "3673146f-01ef-44a4-b277-1332dd810a9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.930895 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.930983 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.931007 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.931024 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3673146f-01ef-44a4-b277-1332dd810a9d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:11 crc kubenswrapper[4831]: I0309 16:17:11.931041 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrjj6\" (UniqueName: \"kubernetes.io/projected/3673146f-01ef-44a4-b277-1332dd810a9d-kube-api-access-zrjj6\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.382864 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" event={"ID":"3673146f-01ef-44a4-b277-1332dd810a9d","Type":"ContainerDied","Data":"98a417e3c6b0284d85b9ce415364219d3c579c313a98f4e8a0139280af2e8c35"} Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.382962 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98a417e3c6b0284d85b9ce415364219d3c579c313a98f4e8a0139280af2e8c35" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.382922 4831 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-fkx4m" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.889750 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-7f64cd86f9-mwgqm"] Mar 09 16:17:12 crc kubenswrapper[4831]: E0309 16:17:12.890047 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3673146f-01ef-44a4-b277-1332dd810a9d" containerName="keystone-bootstrap" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.890062 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3673146f-01ef-44a4-b277-1332dd810a9d" containerName="keystone-bootstrap" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.890271 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3673146f-01ef-44a4-b277-1332dd810a9d" containerName="keystone-bootstrap" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.890931 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.898723 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-7bqjn" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.898782 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.898782 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.906100 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 09 16:17:12 crc kubenswrapper[4831]: I0309 16:17:12.907309 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-7f64cd86f9-mwgqm"] Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.047090 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-config-data\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.047387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tk5\" (UniqueName: \"kubernetes.io/projected/a2fd146a-8317-4a78-b017-e74226b0888d-kube-api-access-l6tk5\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.047518 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-credential-keys\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.047610 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-scripts\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.047761 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-fernet-keys\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc 
kubenswrapper[4831]: I0309 16:17:13.148063 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz"] Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.148722 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tk5\" (UniqueName: \"kubernetes.io/projected/a2fd146a-8317-4a78-b017-e74226b0888d-kube-api-access-l6tk5\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.148775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-credential-keys\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.148801 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-scripts\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.148826 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-fernet-keys\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.148874 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-config-data\") pod 
\"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.149552 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.157974 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-scripts\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.158186 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-credential-keys\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.161707 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-fernet-keys\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.157978 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djwsq" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.172094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2fd146a-8317-4a78-b017-e74226b0888d-config-data\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " 
pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.172940 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz"] Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.176919 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tk5\" (UniqueName: \"kubernetes.io/projected/a2fd146a-8317-4a78-b017-e74226b0888d-kube-api-access-l6tk5\") pod \"keystone-7f64cd86f9-mwgqm\" (UID: \"a2fd146a-8317-4a78-b017-e74226b0888d\") " pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.206680 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.251041 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-util\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.251264 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-bundle\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.251321 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlq8b\" (UniqueName: 
\"kubernetes.io/projected/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-kube-api-access-vlq8b\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.352561 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-util\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.352705 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-bundle\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.352734 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlq8b\" (UniqueName: \"kubernetes.io/projected/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-kube-api-access-vlq8b\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.353039 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-util\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: 
\"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.353216 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-bundle\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.371262 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlq8b\" (UniqueName: \"kubernetes.io/projected/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-kube-api-access-vlq8b\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.531053 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.718014 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-7f64cd86f9-mwgqm"] Mar 09 16:17:13 crc kubenswrapper[4831]: I0309 16:17:13.939196 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz"] Mar 09 16:17:14 crc kubenswrapper[4831]: I0309 16:17:14.402810 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" event={"ID":"a2fd146a-8317-4a78-b017-e74226b0888d","Type":"ContainerStarted","Data":"ec17ed35e92dded9f126a80796c22fed2f22421ded5615df7d5aafcaaa36317e"} Mar 09 16:17:14 crc kubenswrapper[4831]: I0309 16:17:14.402850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" event={"ID":"a2fd146a-8317-4a78-b017-e74226b0888d","Type":"ContainerStarted","Data":"7526de2de76d2a6948ec083dfb9a018b78c46b44a13b9edca57ce35f39957152"} Mar 09 16:17:14 crc kubenswrapper[4831]: I0309 16:17:14.402931 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:14 crc kubenswrapper[4831]: I0309 16:17:14.403979 4831 generic.go:334] "Generic (PLEG): container finished" podID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerID="b5659c9dc1cd296c9904d2652648295b05ceea8a9617cc0feadaa80c8b10844f" exitCode=0 Mar 09 16:17:14 crc kubenswrapper[4831]: I0309 16:17:14.404007 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" event={"ID":"fcdc2307-f387-41ea-851e-2a7cb4fda4f8","Type":"ContainerDied","Data":"b5659c9dc1cd296c9904d2652648295b05ceea8a9617cc0feadaa80c8b10844f"} Mar 09 16:17:14 crc kubenswrapper[4831]: I0309 16:17:14.404035 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" event={"ID":"fcdc2307-f387-41ea-851e-2a7cb4fda4f8","Type":"ContainerStarted","Data":"54b9541a4bb20587134cc3a9746db62f324b6a725cddeea5ae42c7c80714064b"} Mar 09 16:17:14 crc kubenswrapper[4831]: I0309 16:17:14.424383 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" podStartSLOduration=2.424364123 podStartE2EDuration="2.424364123s" podCreationTimestamp="2026-03-09 16:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:17:14.421691436 +0000 UTC m=+1161.555373869" watchObservedRunningTime="2026-03-09 16:17:14.424364123 +0000 UTC m=+1161.558046566" Mar 09 16:17:16 crc kubenswrapper[4831]: I0309 16:17:16.429971 4831 generic.go:334] "Generic (PLEG): container finished" podID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerID="363bce330af8f072be8d436391cbb62fce243e818a4af881bc4311ceea4b632e" exitCode=0 Mar 09 16:17:16 crc kubenswrapper[4831]: I0309 16:17:16.430287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" event={"ID":"fcdc2307-f387-41ea-851e-2a7cb4fda4f8","Type":"ContainerDied","Data":"363bce330af8f072be8d436391cbb62fce243e818a4af881bc4311ceea4b632e"} Mar 09 16:17:17 crc kubenswrapper[4831]: I0309 16:17:17.439958 4831 generic.go:334] "Generic (PLEG): container finished" podID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerID="997a0591edfabee3788e82d82bf77a74350e37e097ff33efd758ffce9376f6cc" exitCode=0 Mar 09 16:17:17 crc kubenswrapper[4831]: I0309 16:17:17.440095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" 
event={"ID":"fcdc2307-f387-41ea-851e-2a7cb4fda4f8","Type":"ContainerDied","Data":"997a0591edfabee3788e82d82bf77a74350e37e097ff33efd758ffce9376f6cc"} Mar 09 16:17:18 crc kubenswrapper[4831]: I0309 16:17:18.824530 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:18 crc kubenswrapper[4831]: I0309 16:17:18.927302 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-bundle\") pod \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " Mar 09 16:17:18 crc kubenswrapper[4831]: I0309 16:17:18.927378 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlq8b\" (UniqueName: \"kubernetes.io/projected/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-kube-api-access-vlq8b\") pod \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " Mar 09 16:17:18 crc kubenswrapper[4831]: I0309 16:17:18.927485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-util\") pod \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\" (UID: \"fcdc2307-f387-41ea-851e-2a7cb4fda4f8\") " Mar 09 16:17:18 crc kubenswrapper[4831]: I0309 16:17:18.928762 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-bundle" (OuterVolumeSpecName: "bundle") pod "fcdc2307-f387-41ea-851e-2a7cb4fda4f8" (UID: "fcdc2307-f387-41ea-851e-2a7cb4fda4f8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:17:18 crc kubenswrapper[4831]: I0309 16:17:18.935542 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-kube-api-access-vlq8b" (OuterVolumeSpecName: "kube-api-access-vlq8b") pod "fcdc2307-f387-41ea-851e-2a7cb4fda4f8" (UID: "fcdc2307-f387-41ea-851e-2a7cb4fda4f8"). InnerVolumeSpecName "kube-api-access-vlq8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:17:18 crc kubenswrapper[4831]: I0309 16:17:18.943071 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-util" (OuterVolumeSpecName: "util") pod "fcdc2307-f387-41ea-851e-2a7cb4fda4f8" (UID: "fcdc2307-f387-41ea-851e-2a7cb4fda4f8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:17:19 crc kubenswrapper[4831]: I0309 16:17:19.029650 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:19 crc kubenswrapper[4831]: I0309 16:17:19.029718 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlq8b\" (UniqueName: \"kubernetes.io/projected/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-kube-api-access-vlq8b\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:19 crc kubenswrapper[4831]: I0309 16:17:19.029747 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcdc2307-f387-41ea-851e-2a7cb4fda4f8-util\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:19 crc kubenswrapper[4831]: I0309 16:17:19.475784 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" 
event={"ID":"fcdc2307-f387-41ea-851e-2a7cb4fda4f8","Type":"ContainerDied","Data":"54b9541a4bb20587134cc3a9746db62f324b6a725cddeea5ae42c7c80714064b"} Mar 09 16:17:19 crc kubenswrapper[4831]: I0309 16:17:19.475831 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b9541a4bb20587134cc3a9746db62f324b6a725cddeea5ae42c7c80714064b" Mar 09 16:17:19 crc kubenswrapper[4831]: I0309 16:17:19.475951 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.935006 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw"] Mar 09 16:17:30 crc kubenswrapper[4831]: E0309 16:17:30.935725 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerName="extract" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.935736 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerName="extract" Mar 09 16:17:30 crc kubenswrapper[4831]: E0309 16:17:30.935754 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerName="pull" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.935760 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerName="pull" Mar 09 16:17:30 crc kubenswrapper[4831]: E0309 16:17:30.935773 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerName="util" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.935779 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerName="util" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.935889 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fcdc2307-f387-41ea-851e-2a7cb4fda4f8" containerName="extract" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.936315 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.940687 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-75wnb" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.940733 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Mar 09 16:17:30 crc kubenswrapper[4831]: I0309 16:17:30.952014 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw"] Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.121202 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrntq\" (UniqueName: \"kubernetes.io/projected/eeb63698-ca7a-4598-a6b3-11e1fca09406-kube-api-access-jrntq\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.121267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eeb63698-ca7a-4598-a6b3-11e1fca09406-apiservice-cert\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.121341 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eeb63698-ca7a-4598-a6b3-11e1fca09406-webhook-cert\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.222421 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eeb63698-ca7a-4598-a6b3-11e1fca09406-webhook-cert\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.222472 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrntq\" (UniqueName: \"kubernetes.io/projected/eeb63698-ca7a-4598-a6b3-11e1fca09406-kube-api-access-jrntq\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.222508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eeb63698-ca7a-4598-a6b3-11e1fca09406-apiservice-cert\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.227698 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eeb63698-ca7a-4598-a6b3-11e1fca09406-apiservice-cert\") pod 
\"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.228314 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eeb63698-ca7a-4598-a6b3-11e1fca09406-webhook-cert\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.249253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrntq\" (UniqueName: \"kubernetes.io/projected/eeb63698-ca7a-4598-a6b3-11e1fca09406-kube-api-access-jrntq\") pod \"barbican-operator-controller-manager-64ff7758ff-l4vrw\" (UID: \"eeb63698-ca7a-4598-a6b3-11e1fca09406\") " pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.260481 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:31 crc kubenswrapper[4831]: I0309 16:17:31.682507 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw"] Mar 09 16:17:32 crc kubenswrapper[4831]: I0309 16:17:32.583390 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" event={"ID":"eeb63698-ca7a-4598-a6b3-11e1fca09406","Type":"ContainerStarted","Data":"404d5c3a235ce9c8c85cca1a10f46cd51b367f22452b9d70a337ba77645f0c6a"} Mar 09 16:17:33 crc kubenswrapper[4831]: I0309 16:17:33.019036 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:17:33 crc kubenswrapper[4831]: I0309 16:17:33.019110 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:17:34 crc kubenswrapper[4831]: I0309 16:17:34.597221 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" event={"ID":"eeb63698-ca7a-4598-a6b3-11e1fca09406","Type":"ContainerStarted","Data":"f48c3106c396e8cd92ba37724450db230bc4a5fda64d1418a3d5c9286ef600b9"} Mar 09 16:17:34 crc kubenswrapper[4831]: I0309 16:17:34.598551 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:34 crc 
kubenswrapper[4831]: I0309 16:17:34.618645 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" podStartSLOduration=2.324198104 podStartE2EDuration="4.618615864s" podCreationTimestamp="2026-03-09 16:17:30 +0000 UTC" firstStartedPulling="2026-03-09 16:17:31.698167319 +0000 UTC m=+1178.831849742" lastFinishedPulling="2026-03-09 16:17:33.992585079 +0000 UTC m=+1181.126267502" observedRunningTime="2026-03-09 16:17:34.61218652 +0000 UTC m=+1181.745868973" watchObservedRunningTime="2026-03-09 16:17:34.618615864 +0000 UTC m=+1181.752298307" Mar 09 16:17:41 crc kubenswrapper[4831]: I0309 16:17:41.267583 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64ff7758ff-l4vrw" Mar 09 16:17:44 crc kubenswrapper[4831]: I0309 16:17:44.685211 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/keystone-7f64cd86f9-mwgqm" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.486447 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-create-krp95"] Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.487885 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.496780 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf"] Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.497695 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.500744 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-db-secret" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.502565 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-krp95"] Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.530967 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf"] Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.588265 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgpjl\" (UniqueName: \"kubernetes.io/projected/9aa19489-9820-4ed2-8e45-d829182a9b07-kube-api-access-rgpjl\") pod \"barbican-db-create-krp95\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.588352 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-operator-scripts\") pod \"barbican-71a7-account-create-update-mbwsf\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.588386 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa19489-9820-4ed2-8e45-d829182a9b07-operator-scripts\") pod \"barbican-db-create-krp95\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.588417 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghqvr\" (UniqueName: \"kubernetes.io/projected/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-kube-api-access-ghqvr\") pod \"barbican-71a7-account-create-update-mbwsf\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.690071 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgpjl\" (UniqueName: \"kubernetes.io/projected/9aa19489-9820-4ed2-8e45-d829182a9b07-kube-api-access-rgpjl\") pod \"barbican-db-create-krp95\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.690163 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-operator-scripts\") pod \"barbican-71a7-account-create-update-mbwsf\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.690214 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa19489-9820-4ed2-8e45-d829182a9b07-operator-scripts\") pod \"barbican-db-create-krp95\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.690232 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghqvr\" (UniqueName: \"kubernetes.io/projected/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-kube-api-access-ghqvr\") pod \"barbican-71a7-account-create-update-mbwsf\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " 
pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.691044 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-operator-scripts\") pod \"barbican-71a7-account-create-update-mbwsf\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.691066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa19489-9820-4ed2-8e45-d829182a9b07-operator-scripts\") pod \"barbican-db-create-krp95\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.727349 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgpjl\" (UniqueName: \"kubernetes.io/projected/9aa19489-9820-4ed2-8e45-d829182a9b07-kube-api-access-rgpjl\") pod \"barbican-db-create-krp95\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.732989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghqvr\" (UniqueName: \"kubernetes.io/projected/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-kube-api-access-ghqvr\") pod \"barbican-71a7-account-create-update-mbwsf\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.810887 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:50 crc kubenswrapper[4831]: I0309 16:17:50.825374 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.252674 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-krp95"] Mar 09 16:17:51 crc kubenswrapper[4831]: W0309 16:17:51.254425 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa19489_9820_4ed2_8e45_d829182a9b07.slice/crio-02494ffbb795b1f184d74972f7dddf2c06cac946c9dbb78f760e736c42b187ca WatchSource:0}: Error finding container 02494ffbb795b1f184d74972f7dddf2c06cac946c9dbb78f760e736c42b187ca: Status 404 returned error can't find the container with id 02494ffbb795b1f184d74972f7dddf2c06cac946c9dbb78f760e736c42b187ca Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.316681 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf"] Mar 09 16:17:51 crc kubenswrapper[4831]: W0309 16:17:51.324932 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b2ff70_bfd2_41cd_a0f7_3922fb99b379.slice/crio-93b6d4d80f7cd08fe27dd7154f3bdba90e44fa57154d8439ab39db80fd056c13 WatchSource:0}: Error finding container 93b6d4d80f7cd08fe27dd7154f3bdba90e44fa57154d8439ab39db80fd056c13: Status 404 returned error can't find the container with id 93b6d4d80f7cd08fe27dd7154f3bdba90e44fa57154d8439ab39db80fd056c13 Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.731039 4831 generic.go:334] "Generic (PLEG): container finished" podID="9aa19489-9820-4ed2-8e45-d829182a9b07" containerID="4199b7a2ccab904a1c5a4d2edc36cf27a1b80580507e80a102596a2d7ad0b319" exitCode=0 Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.731133 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-krp95" 
event={"ID":"9aa19489-9820-4ed2-8e45-d829182a9b07","Type":"ContainerDied","Data":"4199b7a2ccab904a1c5a4d2edc36cf27a1b80580507e80a102596a2d7ad0b319"} Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.731165 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-krp95" event={"ID":"9aa19489-9820-4ed2-8e45-d829182a9b07","Type":"ContainerStarted","Data":"02494ffbb795b1f184d74972f7dddf2c06cac946c9dbb78f760e736c42b187ca"} Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.734103 4831 generic.go:334] "Generic (PLEG): container finished" podID="85b2ff70-bfd2-41cd-a0f7-3922fb99b379" containerID="144ab3d27c412cb08d01b60095af350326339bb75d2c84434196eb25b8dcaf9f" exitCode=0 Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.734142 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" event={"ID":"85b2ff70-bfd2-41cd-a0f7-3922fb99b379","Type":"ContainerDied","Data":"144ab3d27c412cb08d01b60095af350326339bb75d2c84434196eb25b8dcaf9f"} Mar 09 16:17:51 crc kubenswrapper[4831]: I0309 16:17:51.734166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" event={"ID":"85b2ff70-bfd2-41cd-a0f7-3922fb99b379","Type":"ContainerStarted","Data":"93b6d4d80f7cd08fe27dd7154f3bdba90e44fa57154d8439ab39db80fd056c13"} Mar 09 16:17:52 crc kubenswrapper[4831]: I0309 16:17:52.904150 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-nq5bh"] Mar 09 16:17:52 crc kubenswrapper[4831]: I0309 16:17:52.908271 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:17:52 crc kubenswrapper[4831]: I0309 16:17:52.911713 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-drtg7" Mar 09 16:17:52 crc kubenswrapper[4831]: I0309 16:17:52.915024 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-nq5bh"] Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.027629 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhlv2\" (UniqueName: \"kubernetes.io/projected/4dc14231-9873-414d-8a81-5c5b4857bde9-kube-api-access-bhlv2\") pod \"swift-operator-index-nq5bh\" (UID: \"4dc14231-9873-414d-8a81-5c5b4857bde9\") " pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.098392 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.100985 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.129866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhlv2\" (UniqueName: \"kubernetes.io/projected/4dc14231-9873-414d-8a81-5c5b4857bde9-kube-api-access-bhlv2\") pod \"swift-operator-index-nq5bh\" (UID: \"4dc14231-9873-414d-8a81-5c5b4857bde9\") " pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.150090 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhlv2\" (UniqueName: \"kubernetes.io/projected/4dc14231-9873-414d-8a81-5c5b4857bde9-kube-api-access-bhlv2\") pod \"swift-operator-index-nq5bh\" (UID: \"4dc14231-9873-414d-8a81-5c5b4857bde9\") " pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.231301 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgpjl\" (UniqueName: \"kubernetes.io/projected/9aa19489-9820-4ed2-8e45-d829182a9b07-kube-api-access-rgpjl\") pod \"9aa19489-9820-4ed2-8e45-d829182a9b07\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.231365 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghqvr\" (UniqueName: \"kubernetes.io/projected/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-kube-api-access-ghqvr\") pod \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.231455 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa19489-9820-4ed2-8e45-d829182a9b07-operator-scripts\") pod \"9aa19489-9820-4ed2-8e45-d829182a9b07\" (UID: \"9aa19489-9820-4ed2-8e45-d829182a9b07\") " Mar 
09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.231505 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-operator-scripts\") pod \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\" (UID: \"85b2ff70-bfd2-41cd-a0f7-3922fb99b379\") " Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.232291 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85b2ff70-bfd2-41cd-a0f7-3922fb99b379" (UID: "85b2ff70-bfd2-41cd-a0f7-3922fb99b379"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.232386 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa19489-9820-4ed2-8e45-d829182a9b07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9aa19489-9820-4ed2-8e45-d829182a9b07" (UID: "9aa19489-9820-4ed2-8e45-d829182a9b07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.234417 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-kube-api-access-ghqvr" (OuterVolumeSpecName: "kube-api-access-ghqvr") pod "85b2ff70-bfd2-41cd-a0f7-3922fb99b379" (UID: "85b2ff70-bfd2-41cd-a0f7-3922fb99b379"). InnerVolumeSpecName "kube-api-access-ghqvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.235920 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa19489-9820-4ed2-8e45-d829182a9b07-kube-api-access-rgpjl" (OuterVolumeSpecName: "kube-api-access-rgpjl") pod "9aa19489-9820-4ed2-8e45-d829182a9b07" (UID: "9aa19489-9820-4ed2-8e45-d829182a9b07"). InnerVolumeSpecName "kube-api-access-rgpjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.236091 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.335154 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghqvr\" (UniqueName: \"kubernetes.io/projected/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-kube-api-access-ghqvr\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.335180 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa19489-9820-4ed2-8e45-d829182a9b07-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.335189 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85b2ff70-bfd2-41cd-a0f7-3922fb99b379-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.335197 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgpjl\" (UniqueName: \"kubernetes.io/projected/9aa19489-9820-4ed2-8e45-d829182a9b07-kube-api-access-rgpjl\") on node \"crc\" DevicePath \"\"" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.659814 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-nq5bh"] Mar 09 
16:17:53 crc kubenswrapper[4831]: W0309 16:17:53.662181 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc14231_9873_414d_8a81_5c5b4857bde9.slice/crio-44e914226a5c6e15cb92629fb14c12c9ef24ecbf550126cf97dc512d36258d42 WatchSource:0}: Error finding container 44e914226a5c6e15cb92629fb14c12c9ef24ecbf550126cf97dc512d36258d42: Status 404 returned error can't find the container with id 44e914226a5c6e15cb92629fb14c12c9ef24ecbf550126cf97dc512d36258d42 Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.748095 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-krp95" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.748074 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-krp95" event={"ID":"9aa19489-9820-4ed2-8e45-d829182a9b07","Type":"ContainerDied","Data":"02494ffbb795b1f184d74972f7dddf2c06cac946c9dbb78f760e736c42b187ca"} Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.748311 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02494ffbb795b1f184d74972f7dddf2c06cac946c9dbb78f760e736c42b187ca" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.749813 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-nq5bh" event={"ID":"4dc14231-9873-414d-8a81-5c5b4857bde9","Type":"ContainerStarted","Data":"44e914226a5c6e15cb92629fb14c12c9ef24ecbf550126cf97dc512d36258d42"} Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.752458 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" event={"ID":"85b2ff70-bfd2-41cd-a0f7-3922fb99b379","Type":"ContainerDied","Data":"93b6d4d80f7cd08fe27dd7154f3bdba90e44fa57154d8439ab39db80fd056c13"} Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.752483 4831 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93b6d4d80f7cd08fe27dd7154f3bdba90e44fa57154d8439ab39db80fd056c13" Mar 09 16:17:53 crc kubenswrapper[4831]: I0309 16:17:53.752531 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.742614 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-sync-mw2df"] Mar 09 16:17:55 crc kubenswrapper[4831]: E0309 16:17:55.743350 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa19489-9820-4ed2-8e45-d829182a9b07" containerName="mariadb-database-create" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.743367 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa19489-9820-4ed2-8e45-d829182a9b07" containerName="mariadb-database-create" Mar 09 16:17:55 crc kubenswrapper[4831]: E0309 16:17:55.743379 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b2ff70-bfd2-41cd-a0f7-3922fb99b379" containerName="mariadb-account-create-update" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.743387 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b2ff70-bfd2-41cd-a0f7-3922fb99b379" containerName="mariadb-account-create-update" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.743769 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b2ff70-bfd2-41cd-a0f7-3922fb99b379" containerName="mariadb-account-create-update" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.743786 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa19489-9820-4ed2-8e45-d829182a9b07" containerName="mariadb-database-create" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.744547 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.747648 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-hrchz" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.747743 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.748864 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-mw2df"] Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.894208 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/819181c8-0e88-4a97-b3ee-c6add60e4053-db-sync-config-data\") pod \"barbican-db-sync-mw2df\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.894278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4nk\" (UniqueName: \"kubernetes.io/projected/819181c8-0e88-4a97-b3ee-c6add60e4053-kube-api-access-9c4nk\") pod \"barbican-db-sync-mw2df\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.995973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/819181c8-0e88-4a97-b3ee-c6add60e4053-db-sync-config-data\") pod \"barbican-db-sync-mw2df\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:55 crc kubenswrapper[4831]: I0309 16:17:55.996061 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4nk\" 
(UniqueName: \"kubernetes.io/projected/819181c8-0e88-4a97-b3ee-c6add60e4053-kube-api-access-9c4nk\") pod \"barbican-db-sync-mw2df\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:56 crc kubenswrapper[4831]: I0309 16:17:56.000662 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/819181c8-0e88-4a97-b3ee-c6add60e4053-db-sync-config-data\") pod \"barbican-db-sync-mw2df\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:56 crc kubenswrapper[4831]: I0309 16:17:56.010757 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4nk\" (UniqueName: \"kubernetes.io/projected/819181c8-0e88-4a97-b3ee-c6add60e4053-kube-api-access-9c4nk\") pod \"barbican-db-sync-mw2df\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:56 crc kubenswrapper[4831]: I0309 16:17:56.107783 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:17:56 crc kubenswrapper[4831]: I0309 16:17:56.771088 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-mw2df"] Mar 09 16:17:56 crc kubenswrapper[4831]: W0309 16:17:56.772870 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod819181c8_0e88_4a97_b3ee_c6add60e4053.slice/crio-a3317c87d9fe650251bd4c72defa1b3de67f559d152e03d1cd712712216c423b WatchSource:0}: Error finding container a3317c87d9fe650251bd4c72defa1b3de67f559d152e03d1cd712712216c423b: Status 404 returned error can't find the container with id a3317c87d9fe650251bd4c72defa1b3de67f559d152e03d1cd712712216c423b Mar 09 16:17:56 crc kubenswrapper[4831]: I0309 16:17:56.809350 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-mw2df" event={"ID":"819181c8-0e88-4a97-b3ee-c6add60e4053","Type":"ContainerStarted","Data":"a3317c87d9fe650251bd4c72defa1b3de67f559d152e03d1cd712712216c423b"} Mar 09 16:17:56 crc kubenswrapper[4831]: I0309 16:17:56.813544 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-nq5bh" event={"ID":"4dc14231-9873-414d-8a81-5c5b4857bde9","Type":"ContainerStarted","Data":"210997fcfc394f5082ca6b249fa826cc329c909ecce48a2b1f6cb967f9b39222"} Mar 09 16:17:56 crc kubenswrapper[4831]: I0309 16:17:56.836242 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-nq5bh" podStartSLOduration=2.100765415 podStartE2EDuration="4.836216539s" podCreationTimestamp="2026-03-09 16:17:52 +0000 UTC" firstStartedPulling="2026-03-09 16:17:53.665250168 +0000 UTC m=+1200.798932591" lastFinishedPulling="2026-03-09 16:17:56.400701252 +0000 UTC m=+1203.534383715" observedRunningTime="2026-03-09 16:17:56.825746679 +0000 UTC m=+1203.959429122" 
watchObservedRunningTime="2026-03-09 16:17:56.836216539 +0000 UTC m=+1203.969898972" Mar 09 16:17:57 crc kubenswrapper[4831]: E0309 16:17:57.952337 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b2ff70_bfd2_41cd_a0f7_3922fb99b379.slice\": RecentStats: unable to find data in memory cache]" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.158565 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551218-wg5rb"] Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.159734 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.163910 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551218-wg5rb"] Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.164335 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.164420 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.164539 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.262108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ncj\" (UniqueName: \"kubernetes.io/projected/ae125263-a50a-4c83-b1aa-c6712ccb28ce-kube-api-access-q5ncj\") pod \"auto-csr-approver-29551218-wg5rb\" (UID: \"ae125263-a50a-4c83-b1aa-c6712ccb28ce\") " pod="openshift-infra/auto-csr-approver-29551218-wg5rb" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.363435 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ncj\" (UniqueName: \"kubernetes.io/projected/ae125263-a50a-4c83-b1aa-c6712ccb28ce-kube-api-access-q5ncj\") pod \"auto-csr-approver-29551218-wg5rb\" (UID: \"ae125263-a50a-4c83-b1aa-c6712ccb28ce\") " pod="openshift-infra/auto-csr-approver-29551218-wg5rb" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.384911 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ncj\" (UniqueName: \"kubernetes.io/projected/ae125263-a50a-4c83-b1aa-c6712ccb28ce-kube-api-access-q5ncj\") pod \"auto-csr-approver-29551218-wg5rb\" (UID: \"ae125263-a50a-4c83-b1aa-c6712ccb28ce\") " pod="openshift-infra/auto-csr-approver-29551218-wg5rb" Mar 09 16:18:00 crc kubenswrapper[4831]: I0309 16:18:00.478610 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" Mar 09 16:18:01 crc kubenswrapper[4831]: I0309 16:18:01.534105 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551218-wg5rb"] Mar 09 16:18:01 crc kubenswrapper[4831]: W0309 16:18:01.542298 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae125263_a50a_4c83_b1aa_c6712ccb28ce.slice/crio-dc021bf419cc210d8698ba7db375e0d2126c88d9177bbb99038a8e017ade0a4d WatchSource:0}: Error finding container dc021bf419cc210d8698ba7db375e0d2126c88d9177bbb99038a8e017ade0a4d: Status 404 returned error can't find the container with id dc021bf419cc210d8698ba7db375e0d2126c88d9177bbb99038a8e017ade0a4d Mar 09 16:18:01 crc kubenswrapper[4831]: I0309 16:18:01.878299 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" event={"ID":"ae125263-a50a-4c83-b1aa-c6712ccb28ce","Type":"ContainerStarted","Data":"dc021bf419cc210d8698ba7db375e0d2126c88d9177bbb99038a8e017ade0a4d"} Mar 09 
16:18:01 crc kubenswrapper[4831]: I0309 16:18:01.879577 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-mw2df" event={"ID":"819181c8-0e88-4a97-b3ee-c6add60e4053","Type":"ContainerStarted","Data":"edb577aa83de7b86a7c1afebd79d2d607c1e1174b366b86351172666800190e2"} Mar 09 16:18:01 crc kubenswrapper[4831]: I0309 16:18:01.896552 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-db-sync-mw2df" podStartSLOduration=2.415500605 podStartE2EDuration="6.896535226s" podCreationTimestamp="2026-03-09 16:17:55 +0000 UTC" firstStartedPulling="2026-03-09 16:17:56.775614363 +0000 UTC m=+1203.909296786" lastFinishedPulling="2026-03-09 16:18:01.256648984 +0000 UTC m=+1208.390331407" observedRunningTime="2026-03-09 16:18:01.892888951 +0000 UTC m=+1209.026571384" watchObservedRunningTime="2026-03-09 16:18:01.896535226 +0000 UTC m=+1209.030217659" Mar 09 16:18:02 crc kubenswrapper[4831]: I0309 16:18:02.888783 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" event={"ID":"ae125263-a50a-4c83-b1aa-c6712ccb28ce","Type":"ContainerStarted","Data":"5c27ca727ab5980028d7b3c92e8b386f2fd0695fcdc49f10222ed0aec5d8f251"} Mar 09 16:18:02 crc kubenswrapper[4831]: I0309 16:18:02.908342 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" podStartSLOduration=1.926087972 podStartE2EDuration="2.908318931s" podCreationTimestamp="2026-03-09 16:18:00 +0000 UTC" firstStartedPulling="2026-03-09 16:18:01.54481877 +0000 UTC m=+1208.678501193" lastFinishedPulling="2026-03-09 16:18:02.527049689 +0000 UTC m=+1209.660732152" observedRunningTime="2026-03-09 16:18:02.903589616 +0000 UTC m=+1210.037272039" watchObservedRunningTime="2026-03-09 16:18:02.908318931 +0000 UTC m=+1210.042001354" Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.018823 4831 patch_prober.go:28] interesting 
pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.018886 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.237215 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.237264 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.270810 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.896953 4831 generic.go:334] "Generic (PLEG): container finished" podID="ae125263-a50a-4c83-b1aa-c6712ccb28ce" containerID="5c27ca727ab5980028d7b3c92e8b386f2fd0695fcdc49f10222ed0aec5d8f251" exitCode=0 Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.897030 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" event={"ID":"ae125263-a50a-4c83-b1aa-c6712ccb28ce","Type":"ContainerDied","Data":"5c27ca727ab5980028d7b3c92e8b386f2fd0695fcdc49f10222ed0aec5d8f251"} Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.899200 4831 generic.go:334] "Generic (PLEG): container finished" podID="819181c8-0e88-4a97-b3ee-c6add60e4053" 
containerID="edb577aa83de7b86a7c1afebd79d2d607c1e1174b366b86351172666800190e2" exitCode=0 Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.899318 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-mw2df" event={"ID":"819181c8-0e88-4a97-b3ee-c6add60e4053","Type":"ContainerDied","Data":"edb577aa83de7b86a7c1afebd79d2d607c1e1174b366b86351172666800190e2"} Mar 09 16:18:03 crc kubenswrapper[4831]: I0309 16:18:03.939346 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-nq5bh" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.175522 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.304240 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.338419 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5ncj\" (UniqueName: \"kubernetes.io/projected/ae125263-a50a-4c83-b1aa-c6712ccb28ce-kube-api-access-q5ncj\") pod \"ae125263-a50a-4c83-b1aa-c6712ccb28ce\" (UID: \"ae125263-a50a-4c83-b1aa-c6712ccb28ce\") " Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.347548 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae125263-a50a-4c83-b1aa-c6712ccb28ce-kube-api-access-q5ncj" (OuterVolumeSpecName: "kube-api-access-q5ncj") pod "ae125263-a50a-4c83-b1aa-c6712ccb28ce" (UID: "ae125263-a50a-4c83-b1aa-c6712ccb28ce"). InnerVolumeSpecName "kube-api-access-q5ncj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.440127 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c4nk\" (UniqueName: \"kubernetes.io/projected/819181c8-0e88-4a97-b3ee-c6add60e4053-kube-api-access-9c4nk\") pod \"819181c8-0e88-4a97-b3ee-c6add60e4053\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.440164 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/819181c8-0e88-4a97-b3ee-c6add60e4053-db-sync-config-data\") pod \"819181c8-0e88-4a97-b3ee-c6add60e4053\" (UID: \"819181c8-0e88-4a97-b3ee-c6add60e4053\") " Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.440568 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5ncj\" (UniqueName: \"kubernetes.io/projected/ae125263-a50a-4c83-b1aa-c6712ccb28ce-kube-api-access-q5ncj\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.443752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/819181c8-0e88-4a97-b3ee-c6add60e4053-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "819181c8-0e88-4a97-b3ee-c6add60e4053" (UID: "819181c8-0e88-4a97-b3ee-c6add60e4053"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.443936 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819181c8-0e88-4a97-b3ee-c6add60e4053-kube-api-access-9c4nk" (OuterVolumeSpecName: "kube-api-access-9c4nk") pod "819181c8-0e88-4a97-b3ee-c6add60e4053" (UID: "819181c8-0e88-4a97-b3ee-c6add60e4053"). InnerVolumeSpecName "kube-api-access-9c4nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.542199 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c4nk\" (UniqueName: \"kubernetes.io/projected/819181c8-0e88-4a97-b3ee-c6add60e4053-kube-api-access-9c4nk\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.542249 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/819181c8-0e88-4a97-b3ee-c6add60e4053-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.914155 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" event={"ID":"ae125263-a50a-4c83-b1aa-c6712ccb28ce","Type":"ContainerDied","Data":"dc021bf419cc210d8698ba7db375e0d2126c88d9177bbb99038a8e017ade0a4d"} Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.914195 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551218-wg5rb" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.914204 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc021bf419cc210d8698ba7db375e0d2126c88d9177bbb99038a8e017ade0a4d" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.916494 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-mw2df" event={"ID":"819181c8-0e88-4a97-b3ee-c6add60e4053","Type":"ContainerDied","Data":"a3317c87d9fe650251bd4c72defa1b3de67f559d152e03d1cd712712216c423b"} Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.916543 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3317c87d9fe650251bd4c72defa1b3de67f559d152e03d1cd712712216c423b" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.916563 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-mw2df" Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.958127 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551212-j58bc"] Mar 09 16:18:05 crc kubenswrapper[4831]: I0309 16:18:05.980568 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551212-j58bc"] Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.141368 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p"] Mar 09 16:18:06 crc kubenswrapper[4831]: E0309 16:18:06.141652 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819181c8-0e88-4a97-b3ee-c6add60e4053" containerName="barbican-db-sync" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.141669 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="819181c8-0e88-4a97-b3ee-c6add60e4053" containerName="barbican-db-sync" Mar 09 16:18:06 crc kubenswrapper[4831]: E0309 
16:18:06.141680 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae125263-a50a-4c83-b1aa-c6712ccb28ce" containerName="oc" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.141687 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae125263-a50a-4c83-b1aa-c6712ccb28ce" containerName="oc" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.141817 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="819181c8-0e88-4a97-b3ee-c6add60e4053" containerName="barbican-db-sync" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.141832 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae125263-a50a-4c83-b1aa-c6712ccb28ce" containerName="oc" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.142474 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.146970 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-worker-config-data" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.148786 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-hrchz" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.155575 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p"] Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.155743 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.193459 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5"] Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.196211 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.199913 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-keystone-listener-config-data" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.211690 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5"] Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.252462 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54ed23b1-477a-40b1-830f-67a11831d1e8-config-data-custom\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.252788 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcplw\" (UniqueName: \"kubernetes.io/projected/54ed23b1-477a-40b1-830f-67a11831d1e8-kube-api-access-dcplw\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.252821 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed23b1-477a-40b1-830f-67a11831d1e8-config-data\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.252879 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54ed23b1-477a-40b1-830f-67a11831d1e8-logs\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.301359 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs"] Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.302731 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.304958 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-api-config-data" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.318110 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs"] Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.354447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a67fdd93-c613-4b21-aed2-f6163d5405b3-logs\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.354641 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67fdd93-c613-4b21-aed2-f6163d5405b3-config-data\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.355557 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/54ed23b1-477a-40b1-830f-67a11831d1e8-config-data-custom\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.355683 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlx66\" (UniqueName: \"kubernetes.io/projected/a67fdd93-c613-4b21-aed2-f6163d5405b3-kube-api-access-jlx66\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.355761 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcplw\" (UniqueName: \"kubernetes.io/projected/54ed23b1-477a-40b1-830f-67a11831d1e8-kube-api-access-dcplw\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.355855 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed23b1-477a-40b1-830f-67a11831d1e8-config-data\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.356053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a67fdd93-c613-4b21-aed2-f6163d5405b3-config-data-custom\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: 
I0309 16:18:06.356165 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ed23b1-477a-40b1-830f-67a11831d1e8-logs\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.356788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ed23b1-477a-40b1-830f-67a11831d1e8-logs\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.359637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54ed23b1-477a-40b1-830f-67a11831d1e8-config-data-custom\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.360125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed23b1-477a-40b1-830f-67a11831d1e8-config-data\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.374288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcplw\" (UniqueName: \"kubernetes.io/projected/54ed23b1-477a-40b1-830f-67a11831d1e8-kube-api-access-dcplw\") pod \"barbican-worker-564b98bb67-5cv2p\" (UID: \"54ed23b1-477a-40b1-830f-67a11831d1e8\") " pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457542 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/215f54de-45d6-469b-bfec-3a77a1e3d3d8-kube-api-access-x5mbl\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457600 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlx66\" (UniqueName: \"kubernetes.io/projected/a67fdd93-c613-4b21-aed2-f6163d5405b3-kube-api-access-jlx66\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457629 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a67fdd93-c613-4b21-aed2-f6163d5405b3-config-data-custom\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457664 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/215f54de-45d6-469b-bfec-3a77a1e3d3d8-config-data-custom\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457694 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a67fdd93-c613-4b21-aed2-f6163d5405b3-logs\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " 
pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67fdd93-c613-4b21-aed2-f6163d5405b3-config-data\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f54de-45d6-469b-bfec-3a77a1e3d3d8-config-data\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.457790 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215f54de-45d6-469b-bfec-3a77a1e3d3d8-logs\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.458145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a67fdd93-c613-4b21-aed2-f6163d5405b3-logs\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.459442 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.462230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a67fdd93-c613-4b21-aed2-f6163d5405b3-config-data-custom\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.472756 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67fdd93-c613-4b21-aed2-f6163d5405b3-config-data\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.481097 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlx66\" (UniqueName: \"kubernetes.io/projected/a67fdd93-c613-4b21-aed2-f6163d5405b3-kube-api-access-jlx66\") pod \"barbican-keystone-listener-5cb45fbbd4-rg7r5\" (UID: \"a67fdd93-c613-4b21-aed2-f6163d5405b3\") " pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.509429 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.559438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/215f54de-45d6-469b-bfec-3a77a1e3d3d8-config-data-custom\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.559732 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f54de-45d6-469b-bfec-3a77a1e3d3d8-config-data\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.559770 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215f54de-45d6-469b-bfec-3a77a1e3d3d8-logs\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.559801 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/215f54de-45d6-469b-bfec-3a77a1e3d3d8-kube-api-access-x5mbl\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.561515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215f54de-45d6-469b-bfec-3a77a1e3d3d8-logs\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " 
pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.568268 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/215f54de-45d6-469b-bfec-3a77a1e3d3d8-config-data-custom\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.569631 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f54de-45d6-469b-bfec-3a77a1e3d3d8-config-data\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.586599 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/215f54de-45d6-469b-bfec-3a77a1e3d3d8-kube-api-access-x5mbl\") pod \"barbican-api-f95dc6db4-9zfcs\" (UID: \"215f54de-45d6-469b-bfec-3a77a1e3d3d8\") " pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.616227 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.899034 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p"] Mar 09 16:18:06 crc kubenswrapper[4831]: W0309 16:18:06.905178 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ed23b1_477a_40b1_830f_67a11831d1e8.slice/crio-636e554e2538812416dd07656810e2e2a03abb506cccbe81525ee9ed65341bf0 WatchSource:0}: Error finding container 636e554e2538812416dd07656810e2e2a03abb506cccbe81525ee9ed65341bf0: Status 404 returned error can't find the container with id 636e554e2538812416dd07656810e2e2a03abb506cccbe81525ee9ed65341bf0 Mar 09 16:18:06 crc kubenswrapper[4831]: I0309 16:18:06.925168 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" event={"ID":"54ed23b1-477a-40b1-830f-67a11831d1e8","Type":"ContainerStarted","Data":"636e554e2538812416dd07656810e2e2a03abb506cccbe81525ee9ed65341bf0"} Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.037234 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5"] Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.080976 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs"] Mar 09 16:18:07 crc kubenswrapper[4831]: W0309 16:18:07.083682 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod215f54de_45d6_469b_bfec_3a77a1e3d3d8.slice/crio-0b0ce3986c7d32ad0568c39cefb2c6e07841265949e28be7fbc746f47509ebd8 WatchSource:0}: Error finding container 0b0ce3986c7d32ad0568c39cefb2c6e07841265949e28be7fbc746f47509ebd8: Status 404 returned error can't find the container with id 
0b0ce3986c7d32ad0568c39cefb2c6e07841265949e28be7fbc746f47509ebd8 Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.143891 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9"] Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.145046 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.147249 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djwsq" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.159157 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9"] Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.269087 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-bundle\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.269149 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-util\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.269238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wmncv\" (UniqueName: \"kubernetes.io/projected/139b863f-62d1-47a8-b4ec-e39d769d02ac-kube-api-access-wmncv\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.370617 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmncv\" (UniqueName: \"kubernetes.io/projected/139b863f-62d1-47a8-b4ec-e39d769d02ac-kube-api-access-wmncv\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.371110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-bundle\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.371194 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-util\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.371554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-bundle\") pod 
\"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.371623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-util\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.389611 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmncv\" (UniqueName: \"kubernetes.io/projected/139b863f-62d1-47a8-b4ec-e39d769d02ac-kube-api-access-wmncv\") pod \"3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.510759 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.630485 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bee73df-f272-4e5f-9879-3c0ded43f3ab" path="/var/lib/kubelet/pods/3bee73df-f272-4e5f-9879-3c0ded43f3ab/volumes" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.710605 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9"] Mar 09 16:18:07 crc kubenswrapper[4831]: W0309 16:18:07.719236 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139b863f_62d1_47a8_b4ec_e39d769d02ac.slice/crio-1fa9b664360ed7d0229162cdbea07b288d46cb1fd63f80ed0ce3389da2fbde94 WatchSource:0}: Error finding container 1fa9b664360ed7d0229162cdbea07b288d46cb1fd63f80ed0ce3389da2fbde94: Status 404 returned error can't find the container with id 1fa9b664360ed7d0229162cdbea07b288d46cb1fd63f80ed0ce3389da2fbde94 Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.940326 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" event={"ID":"215f54de-45d6-469b-bfec-3a77a1e3d3d8","Type":"ContainerStarted","Data":"bfb83f54558cf98ff50674928f6a959af8e0a79b4289ddf9a0ef99c3a79eb561"} Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.940369 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" event={"ID":"215f54de-45d6-469b-bfec-3a77a1e3d3d8","Type":"ContainerStarted","Data":"4d2c38411d7ce9d42ec896951bd9f27c0da4b2f4cd4813a00fa4a321ee5645f9"} Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.940379 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" 
event={"ID":"215f54de-45d6-469b-bfec-3a77a1e3d3d8","Type":"ContainerStarted","Data":"0b0ce3986c7d32ad0568c39cefb2c6e07841265949e28be7fbc746f47509ebd8"} Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.941037 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.941259 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.945816 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" event={"ID":"139b863f-62d1-47a8-b4ec-e39d769d02ac","Type":"ContainerStarted","Data":"429793b904afb4f75e9f971298b821f5388e07c33e437d09cfc27baf14e7df7c"} Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.945856 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" event={"ID":"139b863f-62d1-47a8-b4ec-e39d769d02ac","Type":"ContainerStarted","Data":"1fa9b664360ed7d0229162cdbea07b288d46cb1fd63f80ed0ce3389da2fbde94"} Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.948317 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" event={"ID":"a67fdd93-c613-4b21-aed2-f6163d5405b3","Type":"ContainerStarted","Data":"ef95d594aa85f3ce4a24ca8f4c1743282f52a83dbf024c8b8695e2da2fee9323"} Mar 09 16:18:07 crc kubenswrapper[4831]: I0309 16:18:07.961261 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" podStartSLOduration=1.9612465449999998 podStartE2EDuration="1.961246545s" podCreationTimestamp="2026-03-09 16:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 16:18:07.960118013 +0000 UTC m=+1215.093800436" watchObservedRunningTime="2026-03-09 16:18:07.961246545 +0000 UTC m=+1215.094928968" Mar 09 16:18:08 crc kubenswrapper[4831]: E0309 16:18:08.120844 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139b863f_62d1_47a8_b4ec_e39d769d02ac.slice/crio-conmon-429793b904afb4f75e9f971298b821f5388e07c33e437d09cfc27baf14e7df7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139b863f_62d1_47a8_b4ec_e39d769d02ac.slice/crio-429793b904afb4f75e9f971298b821f5388e07c33e437d09cfc27baf14e7df7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b2ff70_bfd2_41cd_a0f7_3922fb99b379.slice\": RecentStats: unable to find data in memory cache]" Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.957811 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" event={"ID":"a67fdd93-c613-4b21-aed2-f6163d5405b3","Type":"ContainerStarted","Data":"e0d9bdcd98847aa29f129f6e13a7d84c45c70d9784d930f4aa7dcf0bda99e208"} Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.958375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" event={"ID":"a67fdd93-c613-4b21-aed2-f6163d5405b3","Type":"ContainerStarted","Data":"2794eccb77770dd98b3669e0f17054e21a23ecd592bc8568d432414d412c6405"} Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.960056 4831 generic.go:334] "Generic (PLEG): container finished" podID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerID="429793b904afb4f75e9f971298b821f5388e07c33e437d09cfc27baf14e7df7c" exitCode=0 Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.960132 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" event={"ID":"139b863f-62d1-47a8-b4ec-e39d769d02ac","Type":"ContainerDied","Data":"429793b904afb4f75e9f971298b821f5388e07c33e437d09cfc27baf14e7df7c"} Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.961844 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" event={"ID":"54ed23b1-477a-40b1-830f-67a11831d1e8","Type":"ContainerStarted","Data":"b97fd6ec6ccdd9d7da32ea64c985e965ee5fae252bfaecf8c600f5b6aabe02de"} Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.961883 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" event={"ID":"54ed23b1-477a-40b1-830f-67a11831d1e8","Type":"ContainerStarted","Data":"af25789fb976797802983b4773fdfd8baba5e3ad4987d879b49dd4d9ed557294"} Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.974319 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-keystone-listener-5cb45fbbd4-rg7r5" podStartSLOduration=1.471644739 podStartE2EDuration="2.974302637s" podCreationTimestamp="2026-03-09 16:18:06 +0000 UTC" firstStartedPulling="2026-03-09 16:18:07.040610781 +0000 UTC m=+1214.174293204" lastFinishedPulling="2026-03-09 16:18:08.543268679 +0000 UTC m=+1215.676951102" observedRunningTime="2026-03-09 16:18:08.970839788 +0000 UTC m=+1216.104522221" watchObservedRunningTime="2026-03-09 16:18:08.974302637 +0000 UTC m=+1216.107985070" Mar 09 16:18:08 crc kubenswrapper[4831]: I0309 16:18:08.993117 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-worker-564b98bb67-5cv2p" podStartSLOduration=1.360729452 podStartE2EDuration="2.993075505s" podCreationTimestamp="2026-03-09 16:18:06 +0000 UTC" firstStartedPulling="2026-03-09 16:18:06.907190359 +0000 UTC m=+1214.040872782" 
lastFinishedPulling="2026-03-09 16:18:08.539536412 +0000 UTC m=+1215.673218835" observedRunningTime="2026-03-09 16:18:08.987963779 +0000 UTC m=+1216.121646212" watchObservedRunningTime="2026-03-09 16:18:08.993075505 +0000 UTC m=+1216.126757928" Mar 09 16:18:09 crc kubenswrapper[4831]: I0309 16:18:09.970392 4831 generic.go:334] "Generic (PLEG): container finished" podID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerID="413079e41169cb98e680a8070e0fcd954ce5f1bf7900976733b400d49ac35abd" exitCode=0 Mar 09 16:18:09 crc kubenswrapper[4831]: I0309 16:18:09.970431 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" event={"ID":"139b863f-62d1-47a8-b4ec-e39d769d02ac","Type":"ContainerDied","Data":"413079e41169cb98e680a8070e0fcd954ce5f1bf7900976733b400d49ac35abd"} Mar 09 16:18:10 crc kubenswrapper[4831]: I0309 16:18:10.980980 4831 generic.go:334] "Generic (PLEG): container finished" podID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerID="ba2661b9e56b20e271ab593d2a230e305c97b62661165bea097c5d29535dbf96" exitCode=0 Mar 09 16:18:10 crc kubenswrapper[4831]: I0309 16:18:10.981047 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" event={"ID":"139b863f-62d1-47a8-b4ec-e39d769d02ac","Type":"ContainerDied","Data":"ba2661b9e56b20e271ab593d2a230e305c97b62661165bea097c5d29535dbf96"} Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.267265 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.453228 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmncv\" (UniqueName: \"kubernetes.io/projected/139b863f-62d1-47a8-b4ec-e39d769d02ac-kube-api-access-wmncv\") pod \"139b863f-62d1-47a8-b4ec-e39d769d02ac\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.453359 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-bundle\") pod \"139b863f-62d1-47a8-b4ec-e39d769d02ac\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.453593 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-util\") pod \"139b863f-62d1-47a8-b4ec-e39d769d02ac\" (UID: \"139b863f-62d1-47a8-b4ec-e39d769d02ac\") " Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.454825 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-bundle" (OuterVolumeSpecName: "bundle") pod "139b863f-62d1-47a8-b4ec-e39d769d02ac" (UID: "139b863f-62d1-47a8-b4ec-e39d769d02ac"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.456447 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.461661 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139b863f-62d1-47a8-b4ec-e39d769d02ac-kube-api-access-wmncv" (OuterVolumeSpecName: "kube-api-access-wmncv") pod "139b863f-62d1-47a8-b4ec-e39d769d02ac" (UID: "139b863f-62d1-47a8-b4ec-e39d769d02ac"). InnerVolumeSpecName "kube-api-access-wmncv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.473279 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-util" (OuterVolumeSpecName: "util") pod "139b863f-62d1-47a8-b4ec-e39d769d02ac" (UID: "139b863f-62d1-47a8-b4ec-e39d769d02ac"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.557764 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139b863f-62d1-47a8-b4ec-e39d769d02ac-util\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.557803 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmncv\" (UniqueName: \"kubernetes.io/projected/139b863f-62d1-47a8-b4ec-e39d769d02ac-kube-api-access-wmncv\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.997796 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" event={"ID":"139b863f-62d1-47a8-b4ec-e39d769d02ac","Type":"ContainerDied","Data":"1fa9b664360ed7d0229162cdbea07b288d46cb1fd63f80ed0ce3389da2fbde94"} Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.998129 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fa9b664360ed7d0229162cdbea07b288d46cb1fd63f80ed0ce3389da2fbde94" Mar 09 16:18:12 crc kubenswrapper[4831]: I0309 16:18:12.997826 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9" Mar 09 16:18:14 crc kubenswrapper[4831]: I0309 16:18:14.049617 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" podUID="215f54de-45d6-469b-bfec-3a77a1e3d3d8" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 16:18:18 crc kubenswrapper[4831]: E0309 16:18:18.298572 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b2ff70_bfd2_41cd_a0f7_3922fb99b379.slice\": RecentStats: unable to find data in memory cache]" Mar 09 16:18:18 crc kubenswrapper[4831]: I0309 16:18:18.375983 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:18 crc kubenswrapper[4831]: I0309 16:18:18.492782 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-f95dc6db4-9zfcs" Mar 09 16:18:28 crc kubenswrapper[4831]: E0309 16:18:28.485924 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b2ff70_bfd2_41cd_a0f7_3922fb99b379.slice\": RecentStats: unable to find data in memory cache]" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.032144 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf"] Mar 09 16:18:29 crc kubenswrapper[4831]: E0309 16:18:29.032763 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerName="util" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.032783 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerName="util" Mar 09 16:18:29 crc kubenswrapper[4831]: E0309 16:18:29.032811 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerName="extract" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.032819 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerName="extract" Mar 09 16:18:29 crc kubenswrapper[4831]: E0309 16:18:29.032833 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerName="pull" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.032840 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerName="pull" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.032984 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="139b863f-62d1-47a8-b4ec-e39d769d02ac" containerName="extract" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.033589 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.041269 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.041278 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hlxkc" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.055535 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf"] Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.202679 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qpg8\" (UniqueName: \"kubernetes.io/projected/a12c2319-1993-409d-ace0-5bc02371e54e-kube-api-access-6qpg8\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.202866 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a12c2319-1993-409d-ace0-5bc02371e54e-apiservice-cert\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.202984 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a12c2319-1993-409d-ace0-5bc02371e54e-webhook-cert\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: 
\"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.304507 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a12c2319-1993-409d-ace0-5bc02371e54e-apiservice-cert\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.304611 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a12c2319-1993-409d-ace0-5bc02371e54e-webhook-cert\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.304675 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qpg8\" (UniqueName: \"kubernetes.io/projected/a12c2319-1993-409d-ace0-5bc02371e54e-kube-api-access-6qpg8\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.315598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a12c2319-1993-409d-ace0-5bc02371e54e-webhook-cert\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.323534 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a12c2319-1993-409d-ace0-5bc02371e54e-apiservice-cert\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.337018 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qpg8\" (UniqueName: \"kubernetes.io/projected/a12c2319-1993-409d-ace0-5bc02371e54e-kube-api-access-6qpg8\") pod \"swift-operator-controller-manager-6d856b55c6-8mbrf\" (UID: \"a12c2319-1993-409d-ace0-5bc02371e54e\") " pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.358513 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.765708 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf"] Mar 09 16:18:29 crc kubenswrapper[4831]: W0309 16:18:29.768600 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda12c2319_1993_409d_ace0_5bc02371e54e.slice/crio-76109c1af35727f76ce27ced5ae97a251c8f6dbfa8cd02d18580877e58e693cd WatchSource:0}: Error finding container 76109c1af35727f76ce27ced5ae97a251c8f6dbfa8cd02d18580877e58e693cd: Status 404 returned error can't find the container with id 76109c1af35727f76ce27ced5ae97a251c8f6dbfa8cd02d18580877e58e693cd Mar 09 16:18:29 crc kubenswrapper[4831]: I0309 16:18:29.779359 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:18:30 crc kubenswrapper[4831]: I0309 16:18:30.128790 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" event={"ID":"a12c2319-1993-409d-ace0-5bc02371e54e","Type":"ContainerStarted","Data":"76109c1af35727f76ce27ced5ae97a251c8f6dbfa8cd02d18580877e58e693cd"} Mar 09 16:18:31 crc kubenswrapper[4831]: I0309 16:18:31.138456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" event={"ID":"a12c2319-1993-409d-ace0-5bc02371e54e","Type":"ContainerStarted","Data":"f683e3730356b164cb96766a63d7051732165c30925aa0d15b9e74e89814703f"} Mar 09 16:18:32 crc kubenswrapper[4831]: I0309 16:18:32.144480 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:32 crc kubenswrapper[4831]: I0309 16:18:32.164123 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" podStartSLOduration=1.954604873 podStartE2EDuration="3.1640884s" podCreationTimestamp="2026-03-09 16:18:29 +0000 UTC" firstStartedPulling="2026-03-09 16:18:29.779055807 +0000 UTC m=+1236.912738240" lastFinishedPulling="2026-03-09 16:18:30.988539344 +0000 UTC m=+1238.122221767" observedRunningTime="2026-03-09 16:18:32.160876278 +0000 UTC m=+1239.294558701" watchObservedRunningTime="2026-03-09 16:18:32.1640884 +0000 UTC m=+1239.297770823" Mar 09 16:18:33 crc kubenswrapper[4831]: I0309 16:18:33.019181 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:18:33 crc kubenswrapper[4831]: I0309 16:18:33.019246 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:18:33 crc kubenswrapper[4831]: I0309 16:18:33.019298 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:18:33 crc kubenswrapper[4831]: I0309 16:18:33.019970 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"076daca06d23b29c2390e1c6817586c0ffa3caca70c6f9a78734cb8feec3892c"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:18:33 crc kubenswrapper[4831]: I0309 16:18:33.020062 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://076daca06d23b29c2390e1c6817586c0ffa3caca70c6f9a78734cb8feec3892c" gracePeriod=600 Mar 09 16:18:34 crc kubenswrapper[4831]: I0309 16:18:34.158879 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="076daca06d23b29c2390e1c6817586c0ffa3caca70c6f9a78734cb8feec3892c" exitCode=0 Mar 09 16:18:34 crc kubenswrapper[4831]: I0309 16:18:34.158972 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"076daca06d23b29c2390e1c6817586c0ffa3caca70c6f9a78734cb8feec3892c"} Mar 09 16:18:34 crc kubenswrapper[4831]: I0309 16:18:34.159294 4831 scope.go:117] "RemoveContainer" 
containerID="1edaf25bc17b1a3de007db1b821f8bf147583ed96a9d4890d9a1fd5ed460feab" Mar 09 16:18:35 crc kubenswrapper[4831]: I0309 16:18:35.168539 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"8b1a974da0742e73712bed402dc8072c1e0ed820d3d8d7cddb6f9574502461b3"} Mar 09 16:18:38 crc kubenswrapper[4831]: E0309 16:18:38.647722 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b2ff70_bfd2_41cd_a0f7_3922fb99b379.slice\": RecentStats: unable to find data in memory cache]" Mar 09 16:18:39 crc kubenswrapper[4831]: I0309 16:18:39.364898 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6d856b55c6-8mbrf" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.546283 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.551959 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.554638 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.554870 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.554888 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.554998 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-qkwlq" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.567652 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.728332 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-cache\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.728443 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2hql\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-kube-api-access-c2hql\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.728513 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: 
\"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.730887 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.731025 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-lock\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.832312 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.832413 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-lock\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.832451 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-cache\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.832481 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c2hql\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-kube-api-access-c2hql\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.832503 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: E0309 16:18:43.832566 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:43 crc kubenswrapper[4831]: E0309 16:18:43.832598 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:18:43 crc kubenswrapper[4831]: E0309 16:18:43.832684 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift podName:a7467fe3-df9b-419e-aff9-937d7ec2ebf9 nodeName:}" failed. No retries permitted until 2026-03-09 16:18:44.332662351 +0000 UTC m=+1251.466344764 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift") pod "swift-storage-0" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9") : configmap "swift-ring-files" not found Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.832858 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") device mount path \"/mnt/openstack/pv07\"" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.833048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-cache\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.833209 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-lock\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.855652 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2hql\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-kube-api-access-c2hql\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:43 crc kubenswrapper[4831]: I0309 16:18:43.856033 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") 
" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.044277 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc289"] Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.045598 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.047771 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.048038 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.054916 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.060825 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc289"] Mar 09 16:18:44 crc kubenswrapper[4831]: E0309 16:18:44.065592 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dispersionconf etc-swift kube-api-access-c9rmr ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[dispersionconf etc-swift kube-api-access-c9rmr ring-data-devices scripts swiftconf]: context canceled" pod="swift-kuttl-tests/swift-ring-rebalance-nc289" podUID="e4430cd2-e619-4f60-bc34-374605b3168d" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.089567 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-6q5c9"] Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.090644 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.098265 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc289"] Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.106315 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-6q5c9"] Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237171 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-dispersionconf\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237215 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-ring-data-devices\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237349 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-ring-data-devices\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-swiftconf\") pod \"swift-ring-rebalance-nc289\" (UID: 
\"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237535 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-scripts\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237667 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-dispersionconf\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-swiftconf\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237794 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f43ac66d-7421-4543-952d-76ed6a4f5b8e-etc-swift\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237817 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr72g\" (UniqueName: \"kubernetes.io/projected/f43ac66d-7421-4543-952d-76ed6a4f5b8e-kube-api-access-sr72g\") pod 
\"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237902 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-scripts\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237953 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/e4430cd2-e619-4f60-bc34-374605b3168d-kube-api-access-c9rmr\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.237989 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4430cd2-e619-4f60-bc34-374605b3168d-etc-swift\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.338981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/e4430cd2-e619-4f60-bc34-374605b3168d-kube-api-access-c9rmr\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339045 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/e4430cd2-e619-4f60-bc34-374605b3168d-etc-swift\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339094 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-dispersionconf\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-ring-data-devices\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339141 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-ring-data-devices\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-swiftconf\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-scripts\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339250 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-dispersionconf\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-swiftconf\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339305 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr72g\" (UniqueName: \"kubernetes.io/projected/f43ac66d-7421-4543-952d-76ed6a4f5b8e-kube-api-access-sr72g\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339326 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f43ac66d-7421-4543-952d-76ed6a4f5b8e-etc-swift\") pod 
\"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339359 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-scripts\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.339967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4430cd2-e619-4f60-bc34-374605b3168d-etc-swift\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: E0309 16:18:44.340082 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:44 crc kubenswrapper[4831]: E0309 16:18:44.340101 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:18:44 crc kubenswrapper[4831]: E0309 16:18:44.340143 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift podName:a7467fe3-df9b-419e-aff9-937d7ec2ebf9 nodeName:}" failed. No retries permitted until 2026-03-09 16:18:45.340128143 +0000 UTC m=+1252.473810576 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift") pod "swift-storage-0" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9") : configmap "swift-ring-files" not found Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.340144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-ring-data-devices\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.340183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-scripts\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.340181 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-ring-data-devices\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.340240 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-scripts\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.340542 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/f43ac66d-7421-4543-952d-76ed6a4f5b8e-etc-swift\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.342726 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-swiftconf\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.343752 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-dispersionconf\") pod \"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.344094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-swiftconf\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.344125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-dispersionconf\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.356967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr72g\" (UniqueName: \"kubernetes.io/projected/f43ac66d-7421-4543-952d-76ed6a4f5b8e-kube-api-access-sr72g\") pod 
\"swift-ring-rebalance-6q5c9\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.360301 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/e4430cd2-e619-4f60-bc34-374605b3168d-kube-api-access-c9rmr\") pod \"swift-ring-rebalance-nc289\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.406725 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.517486 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.546971 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.645138 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4430cd2-e619-4f60-bc34-374605b3168d-etc-swift\") pod \"e4430cd2-e619-4f60-bc34-374605b3168d\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.645196 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-swiftconf\") pod \"e4430cd2-e619-4f60-bc34-374605b3168d\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.645289 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-dispersionconf\") pod \"e4430cd2-e619-4f60-bc34-374605b3168d\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.645421 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-ring-data-devices\") pod \"e4430cd2-e619-4f60-bc34-374605b3168d\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.645491 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/e4430cd2-e619-4f60-bc34-374605b3168d-kube-api-access-c9rmr\") pod \"e4430cd2-e619-4f60-bc34-374605b3168d\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.645513 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-scripts\") pod \"e4430cd2-e619-4f60-bc34-374605b3168d\" (UID: \"e4430cd2-e619-4f60-bc34-374605b3168d\") " Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.646253 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-scripts" (OuterVolumeSpecName: "scripts") pod "e4430cd2-e619-4f60-bc34-374605b3168d" (UID: "e4430cd2-e619-4f60-bc34-374605b3168d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.646670 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4430cd2-e619-4f60-bc34-374605b3168d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e4430cd2-e619-4f60-bc34-374605b3168d" (UID: "e4430cd2-e619-4f60-bc34-374605b3168d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.646807 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e4430cd2-e619-4f60-bc34-374605b3168d" (UID: "e4430cd2-e619-4f60-bc34-374605b3168d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.669600 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4430cd2-e619-4f60-bc34-374605b3168d-kube-api-access-c9rmr" (OuterVolumeSpecName: "kube-api-access-c9rmr") pod "e4430cd2-e619-4f60-bc34-374605b3168d" (UID: "e4430cd2-e619-4f60-bc34-374605b3168d"). InnerVolumeSpecName "kube-api-access-c9rmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.669684 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e4430cd2-e619-4f60-bc34-374605b3168d" (UID: "e4430cd2-e619-4f60-bc34-374605b3168d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.669742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e4430cd2-e619-4f60-bc34-374605b3168d" (UID: "e4430cd2-e619-4f60-bc34-374605b3168d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.685474 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-8dqck"] Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.686558 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.711238 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-8dqck"] Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.751211 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.751239 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.751249 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/e4430cd2-e619-4f60-bc34-374605b3168d-kube-api-access-c9rmr\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.751259 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4430cd2-e619-4f60-bc34-374605b3168d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.751269 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4430cd2-e619-4f60-bc34-374605b3168d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.751280 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4430cd2-e619-4f60-bc34-374605b3168d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.853347 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-fqzr6\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-kube-api-access-fqzr6\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.853884 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510bcd91-09f0-4434-8143-cf0cc958ef70-config-data\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.854067 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.854261 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-run-httpd\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.854476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-log-httpd\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.890190 4831 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-6q5c9"] Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.955864 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.956054 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-run-httpd\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.956184 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-log-httpd\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.956314 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzr6\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-kube-api-access-fqzr6\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: E0309 16:18:44.956077 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:44 crc kubenswrapper[4831]: E0309 16:18:44.956622 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod 
swift-kuttl-tests/swift-proxy-76c998454c-8dqck: configmap "swift-ring-files" not found Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.956716 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-log-httpd\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.956595 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-run-httpd\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: E0309 16:18:44.956763 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift podName:510bcd91-09f0-4434-8143-cf0cc958ef70 nodeName:}" failed. No retries permitted until 2026-03-09 16:18:45.456743236 +0000 UTC m=+1252.590425659 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift") pod "swift-proxy-76c998454c-8dqck" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70") : configmap "swift-ring-files" not found Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.957030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510bcd91-09f0-4434-8143-cf0cc958ef70-config-data\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.962570 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510bcd91-09f0-4434-8143-cf0cc958ef70-config-data\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:44 crc kubenswrapper[4831]: I0309 16:18:44.972297 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzr6\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-kube-api-access-fqzr6\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:45 crc kubenswrapper[4831]: I0309 16:18:45.362373 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:45 crc kubenswrapper[4831]: E0309 16:18:45.362602 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:45 crc 
kubenswrapper[4831]: E0309 16:18:45.362745 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:18:45 crc kubenswrapper[4831]: E0309 16:18:45.362815 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift podName:a7467fe3-df9b-419e-aff9-937d7ec2ebf9 nodeName:}" failed. No retries permitted until 2026-03-09 16:18:47.362795837 +0000 UTC m=+1254.496478340 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift") pod "swift-storage-0" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9") : configmap "swift-ring-files" not found Mar 09 16:18:45 crc kubenswrapper[4831]: I0309 16:18:45.464609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:45 crc kubenswrapper[4831]: E0309 16:18:45.465008 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:45 crc kubenswrapper[4831]: E0309 16:18:45.465204 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-8dqck: configmap "swift-ring-files" not found Mar 09 16:18:45 crc kubenswrapper[4831]: E0309 16:18:45.465375 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift podName:510bcd91-09f0-4434-8143-cf0cc958ef70 nodeName:}" failed. 
No retries permitted until 2026-03-09 16:18:46.46534364 +0000 UTC m=+1253.599026073 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift") pod "swift-proxy-76c998454c-8dqck" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70") : configmap "swift-ring-files" not found Mar 09 16:18:45 crc kubenswrapper[4831]: I0309 16:18:45.527573 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc289" Mar 09 16:18:45 crc kubenswrapper[4831]: I0309 16:18:45.527575 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" event={"ID":"f43ac66d-7421-4543-952d-76ed6a4f5b8e","Type":"ContainerStarted","Data":"e3beecb82478eabfe8bad070d6d14b41b163cf9e81ba3a7ad67db1fbfb511dbf"} Mar 09 16:18:45 crc kubenswrapper[4831]: I0309 16:18:45.582524 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc289"] Mar 09 16:18:45 crc kubenswrapper[4831]: I0309 16:18:45.588513 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc289"] Mar 09 16:18:45 crc kubenswrapper[4831]: I0309 16:18:45.625047 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4430cd2-e619-4f60-bc34-374605b3168d" path="/var/lib/kubelet/pods/e4430cd2-e619-4f60-bc34-374605b3168d/volumes" Mar 09 16:18:46 crc kubenswrapper[4831]: I0309 16:18:46.481504 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:46 crc kubenswrapper[4831]: E0309 16:18:46.482974 4831 projected.go:288] Couldn't get configMap 
swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:46 crc kubenswrapper[4831]: E0309 16:18:46.483010 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-8dqck: configmap "swift-ring-files" not found Mar 09 16:18:46 crc kubenswrapper[4831]: E0309 16:18:46.483070 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift podName:510bcd91-09f0-4434-8143-cf0cc958ef70 nodeName:}" failed. No retries permitted until 2026-03-09 16:18:48.483049282 +0000 UTC m=+1255.616731705 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift") pod "swift-proxy-76c998454c-8dqck" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70") : configmap "swift-ring-files" not found Mar 09 16:18:47 crc kubenswrapper[4831]: I0309 16:18:47.398466 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:47 crc kubenswrapper[4831]: E0309 16:18:47.398750 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:47 crc kubenswrapper[4831]: E0309 16:18:47.398790 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:18:47 crc kubenswrapper[4831]: E0309 16:18:47.398874 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift podName:a7467fe3-df9b-419e-aff9-937d7ec2ebf9 nodeName:}" failed. 
No retries permitted until 2026-03-09 16:18:51.398848261 +0000 UTC m=+1258.532530734 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift") pod "swift-storage-0" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9") : configmap "swift-ring-files" not found Mar 09 16:18:48 crc kubenswrapper[4831]: I0309 16:18:48.515955 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:48 crc kubenswrapper[4831]: E0309 16:18:48.516178 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:48 crc kubenswrapper[4831]: E0309 16:18:48.516424 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-8dqck: configmap "swift-ring-files" not found Mar 09 16:18:48 crc kubenswrapper[4831]: E0309 16:18:48.516489 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift podName:510bcd91-09f0-4434-8143-cf0cc958ef70 nodeName:}" failed. No retries permitted until 2026-03-09 16:18:52.516471341 +0000 UTC m=+1259.650153754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift") pod "swift-proxy-76c998454c-8dqck" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70") : configmap "swift-ring-files" not found Mar 09 16:18:48 crc kubenswrapper[4831]: I0309 16:18:48.546911 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" event={"ID":"f43ac66d-7421-4543-952d-76ed6a4f5b8e","Type":"ContainerStarted","Data":"96b65a332167a8e9d0dffb822bfd00aebbc522a524f06acd70608fadf11fab4b"} Mar 09 16:18:48 crc kubenswrapper[4831]: I0309 16:18:48.564773 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" podStartSLOduration=1.6795846970000001 podStartE2EDuration="4.564757052s" podCreationTimestamp="2026-03-09 16:18:44 +0000 UTC" firstStartedPulling="2026-03-09 16:18:44.892654903 +0000 UTC m=+1252.026337326" lastFinishedPulling="2026-03-09 16:18:47.777827258 +0000 UTC m=+1254.911509681" observedRunningTime="2026-03-09 16:18:48.563000542 +0000 UTC m=+1255.696682965" watchObservedRunningTime="2026-03-09 16:18:48.564757052 +0000 UTC m=+1255.698439475" Mar 09 16:18:48 crc kubenswrapper[4831]: E0309 16:18:48.804118 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b2ff70_bfd2_41cd_a0f7_3922fb99b379.slice\": RecentStats: unable to find data in memory cache]" Mar 09 16:18:51 crc kubenswrapper[4831]: I0309 16:18:51.469742 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:51 crc kubenswrapper[4831]: E0309 16:18:51.470078 4831 
projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:51 crc kubenswrapper[4831]: E0309 16:18:51.470218 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:18:51 crc kubenswrapper[4831]: E0309 16:18:51.470298 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift podName:a7467fe3-df9b-419e-aff9-937d7ec2ebf9 nodeName:}" failed. No retries permitted until 2026-03-09 16:18:59.4702731 +0000 UTC m=+1266.603955523 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift") pod "swift-storage-0" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9") : configmap "swift-ring-files" not found Mar 09 16:18:52 crc kubenswrapper[4831]: I0309 16:18:52.592136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:18:52 crc kubenswrapper[4831]: E0309 16:18:52.592337 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:18:52 crc kubenswrapper[4831]: E0309 16:18:52.592715 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-8dqck: configmap "swift-ring-files" not found Mar 09 16:18:52 crc kubenswrapper[4831]: E0309 16:18:52.592771 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift 
podName:510bcd91-09f0-4434-8143-cf0cc958ef70 nodeName:}" failed. No retries permitted until 2026-03-09 16:19:00.592754129 +0000 UTC m=+1267.726436552 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift") pod "swift-proxy-76c998454c-8dqck" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70") : configmap "swift-ring-files" not found Mar 09 16:18:54 crc kubenswrapper[4831]: I0309 16:18:54.604589 4831 generic.go:334] "Generic (PLEG): container finished" podID="f43ac66d-7421-4543-952d-76ed6a4f5b8e" containerID="96b65a332167a8e9d0dffb822bfd00aebbc522a524f06acd70608fadf11fab4b" exitCode=0 Mar 09 16:18:54 crc kubenswrapper[4831]: I0309 16:18:54.604922 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" event={"ID":"f43ac66d-7421-4543-952d-76ed6a4f5b8e","Type":"ContainerDied","Data":"96b65a332167a8e9d0dffb822bfd00aebbc522a524f06acd70608fadf11fab4b"} Mar 09 16:18:55 crc kubenswrapper[4831]: I0309 16:18:55.937023 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.045948 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-swiftconf\") pod \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.046024 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-dispersionconf\") pod \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.046051 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr72g\" (UniqueName: \"kubernetes.io/projected/f43ac66d-7421-4543-952d-76ed6a4f5b8e-kube-api-access-sr72g\") pod \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.046203 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-scripts\") pod \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.046244 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f43ac66d-7421-4543-952d-76ed6a4f5b8e-etc-swift\") pod \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.046281 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-ring-data-devices\") pod \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\" (UID: \"f43ac66d-7421-4543-952d-76ed6a4f5b8e\") " Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.047431 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f43ac66d-7421-4543-952d-76ed6a4f5b8e" (UID: "f43ac66d-7421-4543-952d-76ed6a4f5b8e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.047597 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43ac66d-7421-4543-952d-76ed6a4f5b8e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f43ac66d-7421-4543-952d-76ed6a4f5b8e" (UID: "f43ac66d-7421-4543-952d-76ed6a4f5b8e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.053688 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43ac66d-7421-4543-952d-76ed6a4f5b8e-kube-api-access-sr72g" (OuterVolumeSpecName: "kube-api-access-sr72g") pod "f43ac66d-7421-4543-952d-76ed6a4f5b8e" (UID: "f43ac66d-7421-4543-952d-76ed6a4f5b8e"). InnerVolumeSpecName "kube-api-access-sr72g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.064051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-scripts" (OuterVolumeSpecName: "scripts") pod "f43ac66d-7421-4543-952d-76ed6a4f5b8e" (UID: "f43ac66d-7421-4543-952d-76ed6a4f5b8e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.068022 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f43ac66d-7421-4543-952d-76ed6a4f5b8e" (UID: "f43ac66d-7421-4543-952d-76ed6a4f5b8e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.068270 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f43ac66d-7421-4543-952d-76ed6a4f5b8e" (UID: "f43ac66d-7421-4543-952d-76ed6a4f5b8e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.148380 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.148454 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.148464 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f43ac66d-7421-4543-952d-76ed6a4f5b8e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.148475 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr72g\" (UniqueName: \"kubernetes.io/projected/f43ac66d-7421-4543-952d-76ed6a4f5b8e-kube-api-access-sr72g\") on node \"crc\" DevicePath \"\"" Mar 09 
16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.148488 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f43ac66d-7421-4543-952d-76ed6a4f5b8e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.148498 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f43ac66d-7421-4543-952d-76ed6a4f5b8e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.620571 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" event={"ID":"f43ac66d-7421-4543-952d-76ed6a4f5b8e","Type":"ContainerDied","Data":"e3beecb82478eabfe8bad070d6d14b41b163cf9e81ba3a7ad67db1fbfb511dbf"} Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.620616 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3beecb82478eabfe8bad070d6d14b41b163cf9e81ba3a7ad67db1fbfb511dbf" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.620669 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-6q5c9" Mar 09 16:18:56 crc kubenswrapper[4831]: I0309 16:18:56.850647 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log" Mar 09 16:18:58 crc kubenswrapper[4831]: I0309 16:18:58.473815 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log" Mar 09 16:18:59 crc kubenswrapper[4831]: I0309 16:18:59.502081 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:59 crc kubenswrapper[4831]: I0309 16:18:59.519066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"swift-storage-0\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:18:59 crc kubenswrapper[4831]: I0309 16:18:59.768363 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0"
Mar 09 16:19:00 crc kubenswrapper[4831]: I0309 16:19:00.127429 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:00 crc kubenswrapper[4831]: I0309 16:19:00.231029 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"]
Mar 09 16:19:00 crc kubenswrapper[4831]: I0309 16:19:00.619315 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck"
Mar 09 16:19:00 crc kubenswrapper[4831]: I0309 16:19:00.629385 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"swift-proxy-76c998454c-8dqck\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck"
Mar 09 16:19:00 crc kubenswrapper[4831]: I0309 16:19:00.657276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"e03448a447762f0123c5d90d39b4b16747ac3c151b31b21ff6ade01c7c240d57"}
Mar 09 16:19:00 crc kubenswrapper[4831]: I0309 16:19:00.921058 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck"
Mar 09 16:19:01 crc kubenswrapper[4831]: I0309 16:19:01.135763 4831 scope.go:117] "RemoveContainer" containerID="81196dc320296aaf9ae38b8a4fca71ef7f1383ff61dd0db391d67d1656f182d4"
Mar 09 16:19:01 crc kubenswrapper[4831]: I0309 16:19:01.352369 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-8dqck"]
Mar 09 16:19:01 crc kubenswrapper[4831]: I0309 16:19:01.674234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" event={"ID":"510bcd91-09f0-4434-8143-cf0cc958ef70","Type":"ContainerStarted","Data":"305c98ecfc352eff8f5a82b92a09cd5eb97e4454bea21e66c77bfdb3663f8d67"}
Mar 09 16:19:01 crc kubenswrapper[4831]: I0309 16:19:01.677820 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.682795 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" event={"ID":"510bcd91-09f0-4434-8143-cf0cc958ef70","Type":"ContainerStarted","Data":"95b53319f51590a36e3d3c8aee332a07d219737d9d16d46de4d80bc5c5e1383a"}
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.683299 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" event={"ID":"510bcd91-09f0-4434-8143-cf0cc958ef70","Type":"ContainerStarted","Data":"acc09330be61f489114d47147c5bbcb3dccdaef387cf3b1e3f33dfca5c2a9433"}
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.683324 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck"
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.685163 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502"}
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.685276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91"}
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.685294 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11"}
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.685305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850"}
Mar 09 16:19:02 crc kubenswrapper[4831]: I0309 16:19:02.703493 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" podStartSLOduration=18.703468221 podStartE2EDuration="18.703468221s" podCreationTimestamp="2026-03-09 16:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:19:02.701257877 +0000 UTC m=+1269.834940340" watchObservedRunningTime="2026-03-09 16:19:02.703468221 +0000 UTC m=+1269.837150654"
Mar 09 16:19:03 crc kubenswrapper[4831]: I0309 16:19:03.316996 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:03 crc kubenswrapper[4831]: I0309 16:19:03.696536 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459"}
Mar 09 16:19:03 crc kubenswrapper[4831]: I0309 16:19:03.696917 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck"
Mar 09 16:19:04 crc kubenswrapper[4831]: I0309 16:19:04.710127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3"}
Mar 09 16:19:04 crc kubenswrapper[4831]: I0309 16:19:04.710190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862"}
Mar 09 16:19:04 crc kubenswrapper[4831]: I0309 16:19:04.710207 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4"}
Mar 09 16:19:04 crc kubenswrapper[4831]: I0309 16:19:04.822541 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:06 crc kubenswrapper[4831]: I0309 16:19:06.317455 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:06 crc kubenswrapper[4831]: I0309 16:19:06.735515 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706"}
Mar 09 16:19:06 crc kubenswrapper[4831]: I0309 16:19:06.735815 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e"}
Mar 09 16:19:06 crc kubenswrapper[4831]: I0309 16:19:06.735831 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f"}
Mar 09 16:19:07 crc kubenswrapper[4831]: I0309 16:19:07.749430 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b"}
Mar 09 16:19:07 crc kubenswrapper[4831]: I0309 16:19:07.749486 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df"}
Mar 09 16:19:07 crc kubenswrapper[4831]: I0309 16:19:07.749502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370"}
Mar 09 16:19:07 crc kubenswrapper[4831]: I0309 16:19:07.749516 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerStarted","Data":"79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5"}
Mar 09 16:19:07 crc kubenswrapper[4831]: I0309 16:19:07.783349 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=19.769578145 podStartE2EDuration="25.783331077s" podCreationTimestamp="2026-03-09 16:18:42 +0000 UTC" firstStartedPulling="2026-03-09 16:19:00.23668997 +0000 UTC m=+1267.370372393" lastFinishedPulling="2026-03-09 16:19:06.250442902 +0000 UTC m=+1273.384125325" observedRunningTime="2026-03-09 16:19:07.783051549 +0000 UTC m=+1274.916733972" watchObservedRunningTime="2026-03-09 16:19:07.783331077 +0000 UTC m=+1274.917013500"
Mar 09 16:19:07 crc kubenswrapper[4831]: I0309 16:19:07.884291 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:09 crc kubenswrapper[4831]: I0309 16:19:09.489571 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:10 crc kubenswrapper[4831]: I0309 16:19:10.924672 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck"
Mar 09 16:19:10 crc kubenswrapper[4831]: I0309 16:19:10.925001 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck"
Mar 09 16:19:11 crc kubenswrapper[4831]: I0309 16:19:11.012297 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:12 crc kubenswrapper[4831]: I0309 16:19:12.577669 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-6q5c9_f43ac66d-7421-4543-952d-76ed6a4f5b8e/swift-ring-rebalance/0.log"
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.951813 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 09 16:19:13 crc kubenswrapper[4831]: E0309 16:19:13.952280 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43ac66d-7421-4543-952d-76ed6a4f5b8e" containerName="swift-ring-rebalance"
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.952297 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43ac66d-7421-4543-952d-76ed6a4f5b8e" containerName="swift-ring-rebalance"
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.952513 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43ac66d-7421-4543-952d-76ed6a4f5b8e" containerName="swift-ring-rebalance"
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.957824 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.971465 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.977629 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.986240 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 09 16:19:13 crc kubenswrapper[4831]: I0309 16:19:13.992519 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.139330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8lj\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-kube-api-access-gs8lj\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.139378 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.139578 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnrp7\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-kube-api-access-mnrp7\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.139706 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-lock\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.139740 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-cache\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.139806 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.140011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-lock\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.140067 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-etc-swift\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.140194 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-cache\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.140225 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-etc-swift\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.241595 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.241944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnrp7\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-kube-api-access-mnrp7\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-lock\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242201 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-cache\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242327 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.241961 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") device mount path \"/mnt/openstack/pv01\"" pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242484 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-lock\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242645 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-lock\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242733 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-cache\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242771 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") device mount path \"/mnt/openstack/pv03\"" pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242816 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-etc-swift\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-cache\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.242914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-etc-swift\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.243026 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8lj\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-kube-api-access-gs8lj\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.243382 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-cache\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.243542 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-lock\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.251892 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-etc-swift\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.258335 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-etc-swift\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.259786 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnrp7\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-kube-api-access-mnrp7\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.260240 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8lj\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-kube-api-access-gs8lj\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.263058 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-1\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.265259 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.286592 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.300386 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.459832 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-6q5c9"]
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.467516 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-6q5c9"]
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.481213 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-gbx8d"]
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.482344 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.485889 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.485943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.493740 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-gbx8d"]
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.648674 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-etc-swift\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.648773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-scripts\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.648961 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw77w\" (UniqueName: \"kubernetes.io/projected/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-kube-api-access-qw77w\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.649622 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-swiftconf\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.649766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-ring-data-devices\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.649907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-dispersionconf\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.751937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw77w\" (UniqueName: \"kubernetes.io/projected/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-kube-api-access-qw77w\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.752025 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-swiftconf\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.752063 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-ring-data-devices\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.752119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-dispersionconf\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.752199 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-etc-swift\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.752226 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-scripts\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.752831 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-etc-swift\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.753189 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-scripts\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.753616 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-ring-data-devices\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.757073 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-swiftconf\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.757789 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-dispersionconf\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.772281 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw77w\" (UniqueName: \"kubernetes.io/projected/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-kube-api-access-qw77w\") pod \"swift-ring-rebalance-gbx8d\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:14 crc kubenswrapper[4831]: I0309 16:19:14.802744 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d"
Mar 09 16:19:15 crc kubenswrapper[4831]: W0309 16:19:15.295521 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42cfa737_89bb_4a99_be4f_5a5cbe39ecbb.slice/crio-c9e89fecd181130f50d49f46f17aa2b85279cb8ec3b2d13c83c7a6bf211d7ecb WatchSource:0}: Error finding container c9e89fecd181130f50d49f46f17aa2b85279cb8ec3b2d13c83c7a6bf211d7ecb: Status 404 returned error can't find the container with id c9e89fecd181130f50d49f46f17aa2b85279cb8ec3b2d13c83c7a6bf211d7ecb
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.297595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-gbx8d"]
Mar 09 16:19:15 crc kubenswrapper[4831]: W0309 16:19:15.312158 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod011210e3_7d65_4d5c_ac25_61aba095f4d3.slice/crio-0c4390ab3e217983fc4a9b7d4c402339447b0fb7f06be8e8e589ce38c6370dd0 WatchSource:0}: Error finding container 0c4390ab3e217983fc4a9b7d4c402339447b0fb7f06be8e8e589ce38c6370dd0: Status 404 returned error can't find the container with id 0c4390ab3e217983fc4a9b7d4c402339447b0fb7f06be8e8e589ce38c6370dd0
Mar 09 16:19:15 crc kubenswrapper[4831]: W0309 16:19:15.313102 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaf1c9c4_a793_47fc_8ccd_ba4fefa866a3.slice/crio-26b325edb358f27de2b1d28ed076b7c4c28dec94796e61eb24975cb9a8ec4ecf WatchSource:0}: Error finding container 26b325edb358f27de2b1d28ed076b7c4c28dec94796e61eb24975cb9a8ec4ecf: Status 404 returned error can't find the container with id 26b325edb358f27de2b1d28ed076b7c4c28dec94796e61eb24975cb9a8ec4ecf
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.316125 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.324957 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.635989 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43ac66d-7421-4543-952d-76ed6a4f5b8e" path="/var/lib/kubelet/pods/f43ac66d-7421-4543-952d-76ed6a4f5b8e/volumes"
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.814462 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"201b278f47db51f04e856b419acfcebebc6a7e5aef83909afce3d9b22984e58b"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.814507 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"64f942174429c52e8fe3b9d9614ce3d1f7eaf53128b3932dbf260c41466ddcc5"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.814517 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"0c4390ab3e217983fc4a9b7d4c402339447b0fb7f06be8e8e589ce38c6370dd0"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.817273 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"2d9feedd7999580ce35529f36a37efa59abacaea45888b2f87a718645a627bde"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.817295 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"a83dd135f4488ccdf12adcfdee48235d2b572891b0dffe8c14cd9b9979f44924"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.817305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"26b325edb358f27de2b1d28ed076b7c4c28dec94796e61eb24975cb9a8ec4ecf"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.818768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d" event={"ID":"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb","Type":"ContainerStarted","Data":"57789b2b2e62070f312c166d1309ee9de26a37e6d216fcb2ebd44ebef8a3078e"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.818903 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d" event={"ID":"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb","Type":"ContainerStarted","Data":"c9e89fecd181130f50d49f46f17aa2b85279cb8ec3b2d13c83c7a6bf211d7ecb"}
Mar 09 16:19:15 crc kubenswrapper[4831]: I0309 16:19:15.843990 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d" podStartSLOduration=1.843972503 podStartE2EDuration="1.843972503s" podCreationTimestamp="2026-03-09 16:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:19:15.838232949 +0000 UTC m=+1282.971915372" watchObservedRunningTime="2026-03-09 16:19:15.843972503 +0000 UTC m=+1282.977654916"
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.836180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"976f2ba6734acbd0ddec6ec6f925ae9962b5f2c1044b1d2b888da945054018ab"}
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.836574 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"f17aefe7f1d3ca05dd1cfa58cb89640748c7cfcb2c2608111e4658175fdfe99d"}
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.836594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"63f6dec4ae171d3c4b3f29d7bf66cf1f1062e056f802fe2ed723715a55fe72ce"}
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.836608 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"039b998d39aa2b9d3a92d268f6fcba343f7b76335126d2f8768f6d16e113cb2b"}
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.836618 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"b9501d9e0a7820be98728618556d59404f9f3ed3f2c9b6d97f9700bb230ad537"}
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.836630 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"8f3635741313b820c5ef3176bb9327666308bdef54b72d8d9e296324d907313b"}
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.840712 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"7e0368cf8ceca62972dff6780845e8bd90ee13e6d9c67d3f7b5d4bd90df92ab0"}
Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.840738 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"ad67a0f1b6ab71ed08d7709c8a6e23463ee08b49cff997baf927da95e79da787"} Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.840749 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"cc14237174339607fded7afea1fec968a6b9994788cc6a30c285bcd751348893"} Mar 09 16:19:16 crc kubenswrapper[4831]: I0309 16:19:16.840759 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"be18526ff3466cf3b46fa829866b1844cd86d6ef88e6d413e8243de352574be3"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.893042 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"5123b7aed8b1aa990862ebb7b7820a0067a73b9e62dc2e624e345898276a24ab"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.893102 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"4a8413055e381b938750f7aecdc3701f85696c224b3cbea18fdf9d9405d5395e"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.893116 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"bc7895791517c37c9aaa4594a12f31677b4d576d5ca9fa256c30176fad0d0604"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.893130 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"508790ce92b23143997d6bf5ccf4c115640f9b605d49faee20edaafae8943978"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.893142 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"b6343f4e4d579be2e95f1898c3b5ae303d530d42b77554ccc010ef5b0217e8ea"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.908223 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"58544f91bf4b5a496bac735b6044286f0a66dac13d4a8765f8bae4dbed0262c1"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.908267 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"2c18f77730b0eacad76c1cf8638cccd61eb09ddc280de87a12beb2cfb4357177"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.908276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"e489103e6600750cab6dc07084d6af8aba0b320f8c4c1567aeae5d7b33791fe6"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.908285 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"608fce74f9786ecf9834b623f3635243be5af8d3bd83d99fb177b2d5ab2bad9c"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.908293 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"aa152cc22d247fbdc6f6fc8230b01b11951bc9789958a70e554b479c585250ba"} Mar 09 16:19:17 crc kubenswrapper[4831]: I0309 16:19:17.908302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"2716eca2b11afd50ad5194261604fbcabda8f5bbdae2efb5c70b34cf93551b1a"} Mar 09 16:19:18 crc kubenswrapper[4831]: I0309 16:19:18.923698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"b37ef22ade6ee827494c771600b3c7d8a082745e08e2b7b25b4734e04198b671"} Mar 09 16:19:18 crc kubenswrapper[4831]: I0309 16:19:18.924809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerStarted","Data":"33019fcb1c761c34c0175a7c3ca2117559b6635fbee761fca05f81c92302de1a"} Mar 09 16:19:18 crc kubenswrapper[4831]: I0309 16:19:18.931703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"d4b16cc296f82bbb0cc9fc486ae320532c25270e7ce85fc0109ed66a8fd15397"} Mar 09 16:19:18 crc kubenswrapper[4831]: I0309 16:19:18.931741 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"7477ffb26b774c31b2036c09a4ca36d0999c7dfce001daa275b9f92dc38d58eb"} Mar 09 16:19:18 crc kubenswrapper[4831]: I0309 16:19:18.931753 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerStarted","Data":"d98db244fd35d03d7a324c841a7e5cf6ae32bdd81eea4970384326b638379076"} Mar 09 16:19:18 crc kubenswrapper[4831]: I0309 16:19:18.958544 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=6.9585190279999996 podStartE2EDuration="6.958519028s" podCreationTimestamp="2026-03-09 16:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:19:18.958363574 +0000 UTC m=+1286.092045997" watchObservedRunningTime="2026-03-09 16:19:18.958519028 +0000 UTC m=+1286.092201451" Mar 09 16:19:18 crc kubenswrapper[4831]: I0309 16:19:18.993432 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=6.993363335 podStartE2EDuration="6.993363335s" podCreationTimestamp="2026-03-09 16:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:19:18.989302589 +0000 UTC m=+1286.122985022" watchObservedRunningTime="2026-03-09 16:19:18.993363335 +0000 UTC m=+1286.127045758" Mar 09 16:19:24 crc kubenswrapper[4831]: I0309 16:19:24.986335 4831 generic.go:334] "Generic (PLEG): container finished" podID="42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" containerID="57789b2b2e62070f312c166d1309ee9de26a37e6d216fcb2ebd44ebef8a3078e" exitCode=0 Mar 09 16:19:24 crc kubenswrapper[4831]: I0309 16:19:24.986575 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d" event={"ID":"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb","Type":"ContainerDied","Data":"57789b2b2e62070f312c166d1309ee9de26a37e6d216fcb2ebd44ebef8a3078e"} Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.305825 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.448139 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw77w\" (UniqueName: \"kubernetes.io/projected/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-kube-api-access-qw77w\") pod \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.448282 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-dispersionconf\") pod \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.448326 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-etc-swift\") pod \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.448350 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-swiftconf\") pod \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.448466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-scripts\") pod \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.448511 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-ring-data-devices\") pod \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\" (UID: \"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb\") " Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.449272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" (UID: "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.449377 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" (UID: "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.449952 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.450015 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.455038 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-kube-api-access-qw77w" (OuterVolumeSpecName: "kube-api-access-qw77w") pod "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" (UID: "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb"). 
InnerVolumeSpecName "kube-api-access-qw77w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.473276 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-scripts" (OuterVolumeSpecName: "scripts") pod "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" (UID: "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.473682 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" (UID: "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.501595 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" (UID: "42cfa737-89bb-4a99-be4f-5a5cbe39ecbb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.551110 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.551144 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.551153 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:26 crc kubenswrapper[4831]: I0309 16:19:26.551164 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw77w\" (UniqueName: \"kubernetes.io/projected/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb-kube-api-access-qw77w\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.007775 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d" event={"ID":"42cfa737-89bb-4a99-be4f-5a5cbe39ecbb","Type":"ContainerDied","Data":"c9e89fecd181130f50d49f46f17aa2b85279cb8ec3b2d13c83c7a6bf211d7ecb"} Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.007832 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9e89fecd181130f50d49f46f17aa2b85279cb8ec3b2d13c83c7a6bf211d7ecb" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.007843 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-gbx8d" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.282729 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2"] Mar 09 16:19:27 crc kubenswrapper[4831]: E0309 16:19:27.283036 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" containerName="swift-ring-rebalance" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.283051 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" containerName="swift-ring-rebalance" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.283198 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" containerName="swift-ring-rebalance" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.283667 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.287786 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.288277 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.303187 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2"] Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.465838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-ring-data-devices\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.465936 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-dispersionconf\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.465986 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-swiftconf\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.466147 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-scripts\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.466181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b601693-855a-4f5e-9b33-c44a7af63c87-etc-swift\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.466279 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6kq\" (UniqueName: 
\"kubernetes.io/projected/0b601693-855a-4f5e-9b33-c44a7af63c87-kube-api-access-qt6kq\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.567442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-scripts\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.567546 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b601693-855a-4f5e-9b33-c44a7af63c87-etc-swift\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.567611 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/0b601693-855a-4f5e-9b33-c44a7af63c87-kube-api-access-qt6kq\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.567673 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-ring-data-devices\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.567754 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-dispersionconf\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.567841 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-swiftconf\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.568434 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b601693-855a-4f5e-9b33-c44a7af63c87-etc-swift\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.569066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-ring-data-devices\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.569480 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-scripts\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.573580 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-dispersionconf\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.580978 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-swiftconf\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.594923 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/0b601693-855a-4f5e-9b33-c44a7af63c87-kube-api-access-qt6kq\") pod \"swift-ring-rebalance-debug-9k2x2\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.598780 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:27 crc kubenswrapper[4831]: I0309 16:19:27.871500 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2"] Mar 09 16:19:28 crc kubenswrapper[4831]: I0309 16:19:28.015982 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" event={"ID":"0b601693-855a-4f5e-9b33-c44a7af63c87","Type":"ContainerStarted","Data":"08a3a56d46fda9c72c1f82d3d5b50b753734300a1d62c739d5b73831ad95d574"} Mar 09 16:19:29 crc kubenswrapper[4831]: I0309 16:19:29.026293 4831 generic.go:334] "Generic (PLEG): container finished" podID="0b601693-855a-4f5e-9b33-c44a7af63c87" containerID="d8329dfa572f39031000426bc2a5ff246602505d4fcad021ae3c022a294a7885" exitCode=0 Mar 09 16:19:29 crc kubenswrapper[4831]: I0309 16:19:29.026372 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" event={"ID":"0b601693-855a-4f5e-9b33-c44a7af63c87","Type":"ContainerDied","Data":"d8329dfa572f39031000426bc2a5ff246602505d4fcad021ae3c022a294a7885"} Mar 09 16:19:29 crc kubenswrapper[4831]: I0309 16:19:29.082652 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2"] Mar 09 16:19:29 crc kubenswrapper[4831]: I0309 16:19:29.101415 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2"] Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.294250 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.410780 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-dispersionconf\") pod \"0b601693-855a-4f5e-9b33-c44a7af63c87\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.410835 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-swiftconf\") pod \"0b601693-855a-4f5e-9b33-c44a7af63c87\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.410911 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/0b601693-855a-4f5e-9b33-c44a7af63c87-kube-api-access-qt6kq\") pod \"0b601693-855a-4f5e-9b33-c44a7af63c87\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.410950 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b601693-855a-4f5e-9b33-c44a7af63c87-etc-swift\") pod \"0b601693-855a-4f5e-9b33-c44a7af63c87\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.410980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-ring-data-devices\") pod \"0b601693-855a-4f5e-9b33-c44a7af63c87\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.411008 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-scripts\") pod \"0b601693-855a-4f5e-9b33-c44a7af63c87\" (UID: \"0b601693-855a-4f5e-9b33-c44a7af63c87\") " Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.411866 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0b601693-855a-4f5e-9b33-c44a7af63c87" (UID: "0b601693-855a-4f5e-9b33-c44a7af63c87"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.411941 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b601693-855a-4f5e-9b33-c44a7af63c87-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0b601693-855a-4f5e-9b33-c44a7af63c87" (UID: "0b601693-855a-4f5e-9b33-c44a7af63c87"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.415336 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b601693-855a-4f5e-9b33-c44a7af63c87-kube-api-access-qt6kq" (OuterVolumeSpecName: "kube-api-access-qt6kq") pod "0b601693-855a-4f5e-9b33-c44a7af63c87" (UID: "0b601693-855a-4f5e-9b33-c44a7af63c87"). InnerVolumeSpecName "kube-api-access-qt6kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.432759 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0b601693-855a-4f5e-9b33-c44a7af63c87" (UID: "0b601693-855a-4f5e-9b33-c44a7af63c87"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.438014 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-scripts" (OuterVolumeSpecName: "scripts") pod "0b601693-855a-4f5e-9b33-c44a7af63c87" (UID: "0b601693-855a-4f5e-9b33-c44a7af63c87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.440028 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0b601693-855a-4f5e-9b33-c44a7af63c87" (UID: "0b601693-855a-4f5e-9b33-c44a7af63c87"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.462208 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84fxk"] Mar 09 16:19:30 crc kubenswrapper[4831]: E0309 16:19:30.462550 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b601693-855a-4f5e-9b33-c44a7af63c87" containerName="swift-ring-rebalance" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.462570 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b601693-855a-4f5e-9b33-c44a7af63c87" containerName="swift-ring-rebalance" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.462752 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b601693-855a-4f5e-9b33-c44a7af63c87" containerName="swift-ring-rebalance" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.463249 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.476134 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84fxk"] Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.525877 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.525917 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b601693-855a-4f5e-9b33-c44a7af63c87-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.525932 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/0b601693-855a-4f5e-9b33-c44a7af63c87-kube-api-access-qt6kq\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.525944 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b601693-855a-4f5e-9b33-c44a7af63c87-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.525956 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.525967 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b601693-855a-4f5e-9b33-c44a7af63c87-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.627897 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-ring-data-devices\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.627943 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87xfg\" (UniqueName: \"kubernetes.io/projected/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-kube-api-access-87xfg\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.627966 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-dispersionconf\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.627988 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-etc-swift\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.628102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-scripts\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc 
kubenswrapper[4831]: I0309 16:19:30.628178 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-swiftconf\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.729809 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-scripts\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.730185 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-swiftconf\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.730237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-ring-data-devices\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.730272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87xfg\" (UniqueName: \"kubernetes.io/projected/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-kube-api-access-87xfg\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.730305 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-dispersionconf\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.730338 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-etc-swift\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.731026 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-etc-swift\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.731285 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-ring-data-devices\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.731326 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-scripts\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.734165 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-swiftconf\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.735614 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-dispersionconf\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.752364 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87xfg\" (UniqueName: \"kubernetes.io/projected/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-kube-api-access-87xfg\") pod \"swift-ring-rebalance-debug-84fxk\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:30 crc kubenswrapper[4831]: I0309 16:19:30.841080 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:31 crc kubenswrapper[4831]: I0309 16:19:31.048682 4831 scope.go:117] "RemoveContainer" containerID="d8329dfa572f39031000426bc2a5ff246602505d4fcad021ae3c022a294a7885" Mar 09 16:19:31 crc kubenswrapper[4831]: I0309 16:19:31.048795 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9k2x2" Mar 09 16:19:31 crc kubenswrapper[4831]: I0309 16:19:31.118358 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84fxk"] Mar 09 16:19:31 crc kubenswrapper[4831]: W0309 16:19:31.125891 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ff6bf3_b5aa_44b3_ba52_17bcae4f1e35.slice/crio-62c377a59cae0a9b8c561095c98042c5ccf6bae53303a22662038f3fb55588e0 WatchSource:0}: Error finding container 62c377a59cae0a9b8c561095c98042c5ccf6bae53303a22662038f3fb55588e0: Status 404 returned error can't find the container with id 62c377a59cae0a9b8c561095c98042c5ccf6bae53303a22662038f3fb55588e0 Mar 09 16:19:31 crc kubenswrapper[4831]: I0309 16:19:31.630904 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b601693-855a-4f5e-9b33-c44a7af63c87" path="/var/lib/kubelet/pods/0b601693-855a-4f5e-9b33-c44a7af63c87/volumes" Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.059298 4831 generic.go:334] "Generic (PLEG): container finished" podID="85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" containerID="3deec4a5e2cb7160e46648e85a8255963ee455c768c78c5cf164ade921f2c927" exitCode=0 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.059669 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" event={"ID":"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35","Type":"ContainerDied","Data":"3deec4a5e2cb7160e46648e85a8255963ee455c768c78c5cf164ade921f2c927"} Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.059699 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" event={"ID":"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35","Type":"ContainerStarted","Data":"62c377a59cae0a9b8c561095c98042c5ccf6bae53303a22662038f3fb55588e0"} Mar 09 16:19:32 crc kubenswrapper[4831]: 
I0309 16:19:32.111554 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84fxk"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.125943 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84fxk"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.345172 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.345665 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-server" containerID="cri-o://a83dd135f4488ccdf12adcfdee48235d2b572891b0dffe8c14cd9b9979f44924" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346023 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="swift-recon-cron" containerID="cri-o://b37ef22ade6ee827494c771600b3c7d8a082745e08e2b7b25b4734e04198b671" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346067 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="rsync" containerID="cri-o://33019fcb1c761c34c0175a7c3ca2117559b6635fbee761fca05f81c92302de1a" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346101 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-expirer" containerID="cri-o://5123b7aed8b1aa990862ebb7b7820a0067a73b9e62dc2e624e345898276a24ab" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346135 4831 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-updater" containerID="cri-o://4a8413055e381b938750f7aecdc3701f85696c224b3cbea18fdf9d9405d5395e" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346174 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-auditor" containerID="cri-o://bc7895791517c37c9aaa4594a12f31677b4d576d5ca9fa256c30176fad0d0604" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346206 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-replicator" containerID="cri-o://508790ce92b23143997d6bf5ccf4c115640f9b605d49faee20edaafae8943978" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346237 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-server" containerID="cri-o://b6343f4e4d579be2e95f1898c3b5ae303d530d42b77554ccc010ef5b0217e8ea" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346265 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-updater" containerID="cri-o://976f2ba6734acbd0ddec6ec6f925ae9962b5f2c1044b1d2b888da945054018ab" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346297 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-auditor" 
containerID="cri-o://f17aefe7f1d3ca05dd1cfa58cb89640748c7cfcb2c2608111e4658175fdfe99d" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346331 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-replicator" containerID="cri-o://63f6dec4ae171d3c4b3f29d7bf66cf1f1062e056f802fe2ed723715a55fe72ce" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346360 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-server" containerID="cri-o://039b998d39aa2b9d3a92d268f6fcba343f7b76335126d2f8768f6d16e113cb2b" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346390 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-reaper" containerID="cri-o://b9501d9e0a7820be98728618556d59404f9f3ed3f2c9b6d97f9700bb230ad537" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346449 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-auditor" containerID="cri-o://8f3635741313b820c5ef3176bb9327666308bdef54b72d8d9e296324d907313b" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.346575 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-replicator" containerID="cri-o://2d9feedd7999580ce35529f36a37efa59abacaea45888b2f87a718645a627bde" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.358198 4831 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.358915 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-server" containerID="cri-o://64f942174429c52e8fe3b9d9614ce3d1f7eaf53128b3932dbf260c41466ddcc5" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359191 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-updater" containerID="cri-o://2716eca2b11afd50ad5194261604fbcabda8f5bbdae2efb5c70b34cf93551b1a" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359427 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-auditor" containerID="cri-o://58544f91bf4b5a496bac735b6044286f0a66dac13d4a8765f8bae4dbed0262c1" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359525 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-replicator" containerID="cri-o://7e0368cf8ceca62972dff6780845e8bd90ee13e6d9c67d3f7b5d4bd90df92ab0" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359532 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="swift-recon-cron" containerID="cri-o://d4b16cc296f82bbb0cc9fc486ae320532c25270e7ce85fc0109ed66a8fd15397" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359617 4831 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-server" containerID="cri-o://ad67a0f1b6ab71ed08d7709c8a6e23463ee08b49cff997baf927da95e79da787" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359669 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-reaper" containerID="cri-o://cc14237174339607fded7afea1fec968a6b9994788cc6a30c285bcd751348893" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359716 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-auditor" containerID="cri-o://be18526ff3466cf3b46fa829866b1844cd86d6ef88e6d413e8243de352574be3" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359768 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-replicator" containerID="cri-o://201b278f47db51f04e856b419acfcebebc6a7e5aef83909afce3d9b22984e58b" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359830 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="rsync" containerID="cri-o://7477ffb26b774c31b2036c09a4ca36d0999c7dfce001daa275b9f92dc38d58eb" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.359981 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-auditor" containerID="cri-o://e489103e6600750cab6dc07084d6af8aba0b320f8c4c1567aeae5d7b33791fe6" 
gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.360225 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-expirer" containerID="cri-o://d98db244fd35d03d7a324c841a7e5cf6ae32bdd81eea4970384326b638379076" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.360455 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-updater" containerID="cri-o://2c18f77730b0eacad76c1cf8638cccd61eb09ddc280de87a12beb2cfb4357177" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.360608 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-replicator" containerID="cri-o://608fce74f9786ecf9834b623f3635243be5af8d3bd83d99fb177b2d5ab2bad9c" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.360767 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-server" containerID="cri-o://aa152cc22d247fbdc6f6fc8230b01b11951bc9789958a70e554b479c585250ba" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.370179 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.370817 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-server" containerID="cri-o://efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850" gracePeriod=30 Mar 09 16:19:32 crc 
kubenswrapper[4831]: I0309 16:19:32.371231 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="swift-recon-cron" containerID="cri-o://6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371304 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="rsync" containerID="cri-o://5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371354 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-expirer" containerID="cri-o://6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371423 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-updater" containerID="cri-o://79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371476 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-auditor" containerID="cri-o://07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371526 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" 
containerName="object-replicator" containerID="cri-o://19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371570 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-server" containerID="cri-o://c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371620 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-updater" containerID="cri-o://bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371676 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-auditor" containerID="cri-o://ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371727 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-replicator" containerID="cri-o://7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371776 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-server" containerID="cri-o://08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371823 
4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-reaper" containerID="cri-o://07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371870 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-auditor" containerID="cri-o://dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.371914 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-replicator" containerID="cri-o://9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.376603 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-gbx8d"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.385285 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-gbx8d"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.422824 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-8dqck"] Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.423063 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-httpd" containerID="cri-o://acc09330be61f489114d47147c5bbcb3dccdaef387cf3b1e3f33dfca5c2a9433" gracePeriod=30 Mar 09 16:19:32 crc kubenswrapper[4831]: I0309 16:19:32.423524 4831 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-server" containerID="cri-o://95b53319f51590a36e3d3c8aee332a07d219737d9d16d46de4d80bc5c5e1383a" gracePeriod=30 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094348 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094383 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094412 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094425 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094437 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094456 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094464 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" 
containerID="bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094472 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094479 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094487 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094495 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094502 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094509 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094517 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerID="efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094595 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094608 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094619 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094629 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094640 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094651 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094661 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094672 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094683 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094706 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094716 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11"} Mar 09 
16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.094726 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119668 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="33019fcb1c761c34c0175a7c3ca2117559b6635fbee761fca05f81c92302de1a" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119697 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="5123b7aed8b1aa990862ebb7b7820a0067a73b9e62dc2e624e345898276a24ab" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119706 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="4a8413055e381b938750f7aecdc3701f85696c224b3cbea18fdf9d9405d5395e" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119714 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="bc7895791517c37c9aaa4594a12f31677b4d576d5ca9fa256c30176fad0d0604" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119721 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="508790ce92b23143997d6bf5ccf4c115640f9b605d49faee20edaafae8943978" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119710 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"33019fcb1c761c34c0175a7c3ca2117559b6635fbee761fca05f81c92302de1a"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119772 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"5123b7aed8b1aa990862ebb7b7820a0067a73b9e62dc2e624e345898276a24ab"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119794 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"4a8413055e381b938750f7aecdc3701f85696c224b3cbea18fdf9d9405d5395e"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119811 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"bc7895791517c37c9aaa4594a12f31677b4d576d5ca9fa256c30176fad0d0604"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119827 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"508790ce92b23143997d6bf5ccf4c115640f9b605d49faee20edaafae8943978"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119841 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"b6343f4e4d579be2e95f1898c3b5ae303d530d42b77554ccc010ef5b0217e8ea"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119729 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="b6343f4e4d579be2e95f1898c3b5ae303d530d42b77554ccc010ef5b0217e8ea" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119869 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="976f2ba6734acbd0ddec6ec6f925ae9962b5f2c1044b1d2b888da945054018ab" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119884 
4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="f17aefe7f1d3ca05dd1cfa58cb89640748c7cfcb2c2608111e4658175fdfe99d" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"976f2ba6734acbd0ddec6ec6f925ae9962b5f2c1044b1d2b888da945054018ab"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119902 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="63f6dec4ae171d3c4b3f29d7bf66cf1f1062e056f802fe2ed723715a55fe72ce" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119913 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="039b998d39aa2b9d3a92d268f6fcba343f7b76335126d2f8768f6d16e113cb2b" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119925 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="b9501d9e0a7820be98728618556d59404f9f3ed3f2c9b6d97f9700bb230ad537" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119936 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="8f3635741313b820c5ef3176bb9327666308bdef54b72d8d9e296324d907313b" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119989 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="2d9feedd7999580ce35529f36a37efa59abacaea45888b2f87a718645a627bde" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.120003 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" 
containerID="a83dd135f4488ccdf12adcfdee48235d2b572891b0dffe8c14cd9b9979f44924" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.119916 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"f17aefe7f1d3ca05dd1cfa58cb89640748c7cfcb2c2608111e4658175fdfe99d"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.120067 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"63f6dec4ae171d3c4b3f29d7bf66cf1f1062e056f802fe2ed723715a55fe72ce"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.120076 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"039b998d39aa2b9d3a92d268f6fcba343f7b76335126d2f8768f6d16e113cb2b"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.120083 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"b9501d9e0a7820be98728618556d59404f9f3ed3f2c9b6d97f9700bb230ad537"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.120092 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"8f3635741313b820c5ef3176bb9327666308bdef54b72d8d9e296324d907313b"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.120099 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"2d9feedd7999580ce35529f36a37efa59abacaea45888b2f87a718645a627bde"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.120121 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"a83dd135f4488ccdf12adcfdee48235d2b572891b0dffe8c14cd9b9979f44924"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126018 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="7477ffb26b774c31b2036c09a4ca36d0999c7dfce001daa275b9f92dc38d58eb" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126049 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="d98db244fd35d03d7a324c841a7e5cf6ae32bdd81eea4970384326b638379076" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126056 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="2c18f77730b0eacad76c1cf8638cccd61eb09ddc280de87a12beb2cfb4357177" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126066 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="e489103e6600750cab6dc07084d6af8aba0b320f8c4c1567aeae5d7b33791fe6" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126072 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="608fce74f9786ecf9834b623f3635243be5af8d3bd83d99fb177b2d5ab2bad9c" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126078 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="aa152cc22d247fbdc6f6fc8230b01b11951bc9789958a70e554b479c585250ba" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126087 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" 
containerID="2716eca2b11afd50ad5194261604fbcabda8f5bbdae2efb5c70b34cf93551b1a" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126094 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="58544f91bf4b5a496bac735b6044286f0a66dac13d4a8765f8bae4dbed0262c1" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126101 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="7e0368cf8ceca62972dff6780845e8bd90ee13e6d9c67d3f7b5d4bd90df92ab0" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126106 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="ad67a0f1b6ab71ed08d7709c8a6e23463ee08b49cff997baf927da95e79da787" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126112 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="cc14237174339607fded7afea1fec968a6b9994788cc6a30c285bcd751348893" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126104 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"7477ffb26b774c31b2036c09a4ca36d0999c7dfce001daa275b9f92dc38d58eb"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"d98db244fd35d03d7a324c841a7e5cf6ae32bdd81eea4970384326b638379076"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"2c18f77730b0eacad76c1cf8638cccd61eb09ddc280de87a12beb2cfb4357177"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"e489103e6600750cab6dc07084d6af8aba0b320f8c4c1567aeae5d7b33791fe6"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126118 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="be18526ff3466cf3b46fa829866b1844cd86d6ef88e6d413e8243de352574be3" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126222 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"608fce74f9786ecf9834b623f3635243be5af8d3bd83d99fb177b2d5ab2bad9c"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126251 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"aa152cc22d247fbdc6f6fc8230b01b11951bc9789958a70e554b479c585250ba"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126269 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"2716eca2b11afd50ad5194261604fbcabda8f5bbdae2efb5c70b34cf93551b1a"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126285 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"58544f91bf4b5a496bac735b6044286f0a66dac13d4a8765f8bae4dbed0262c1"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 
16:19:33.126301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"7e0368cf8ceca62972dff6780845e8bd90ee13e6d9c67d3f7b5d4bd90df92ab0"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126228 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="201b278f47db51f04e856b419acfcebebc6a7e5aef83909afce3d9b22984e58b" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126317 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"ad67a0f1b6ab71ed08d7709c8a6e23463ee08b49cff997baf927da95e79da787"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126326 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="64f942174429c52e8fe3b9d9614ce3d1f7eaf53128b3932dbf260c41466ddcc5" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126338 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"cc14237174339607fded7afea1fec968a6b9994788cc6a30c285bcd751348893"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126357 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"be18526ff3466cf3b46fa829866b1844cd86d6ef88e6d413e8243de352574be3"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126373 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"201b278f47db51f04e856b419acfcebebc6a7e5aef83909afce3d9b22984e58b"} 
Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.126389 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"64f942174429c52e8fe3b9d9614ce3d1f7eaf53128b3932dbf260c41466ddcc5"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.128353 4831 generic.go:334] "Generic (PLEG): container finished" podID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerID="95b53319f51590a36e3d3c8aee332a07d219737d9d16d46de4d80bc5c5e1383a" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.128370 4831 generic.go:334] "Generic (PLEG): container finished" podID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerID="acc09330be61f489114d47147c5bbcb3dccdaef387cf3b1e3f33dfca5c2a9433" exitCode=0 Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.128387 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" event={"ID":"510bcd91-09f0-4434-8143-cf0cc958ef70","Type":"ContainerDied","Data":"95b53319f51590a36e3d3c8aee332a07d219737d9d16d46de4d80bc5c5e1383a"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.128447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" event={"ID":"510bcd91-09f0-4434-8143-cf0cc958ef70","Type":"ContainerDied","Data":"acc09330be61f489114d47147c5bbcb3dccdaef387cf3b1e3f33dfca5c2a9433"} Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.304236 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.364497 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476382 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510bcd91-09f0-4434-8143-cf0cc958ef70-config-data\") pod \"510bcd91-09f0-4434-8143-cf0cc958ef70\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476480 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-ring-data-devices\") pod \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476558 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-etc-swift\") pod \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476617 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") pod \"510bcd91-09f0-4434-8143-cf0cc958ef70\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476647 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-swiftconf\") pod \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476668 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-scripts\") pod \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476682 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-run-httpd\") pod \"510bcd91-09f0-4434-8143-cf0cc958ef70\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476738 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-log-httpd\") pod \"510bcd91-09f0-4434-8143-cf0cc958ef70\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476760 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87xfg\" (UniqueName: \"kubernetes.io/projected/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-kube-api-access-87xfg\") pod \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476789 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-dispersionconf\") pod \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\" (UID: \"85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.476816 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqzr6\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-kube-api-access-fqzr6\") pod \"510bcd91-09f0-4434-8143-cf0cc958ef70\" (UID: \"510bcd91-09f0-4434-8143-cf0cc958ef70\") " Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 
16:19:33.477098 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" (UID: "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.477197 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "510bcd91-09f0-4434-8143-cf0cc958ef70" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.477303 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" (UID: "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.477821 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "510bcd91-09f0-4434-8143-cf0cc958ef70" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.481637 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-kube-api-access-87xfg" (OuterVolumeSpecName: "kube-api-access-87xfg") pod "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" (UID: "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35"). InnerVolumeSpecName "kube-api-access-87xfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.481873 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-kube-api-access-fqzr6" (OuterVolumeSpecName: "kube-api-access-fqzr6") pod "510bcd91-09f0-4434-8143-cf0cc958ef70" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70"). InnerVolumeSpecName "kube-api-access-fqzr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.481953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "510bcd91-09f0-4434-8143-cf0cc958ef70" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.502461 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" (UID: "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.511886 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-scripts" (OuterVolumeSpecName: "scripts") pod "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" (UID: "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.512420 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510bcd91-09f0-4434-8143-cf0cc958ef70-config-data" (OuterVolumeSpecName: "config-data") pod "510bcd91-09f0-4434-8143-cf0cc958ef70" (UID: "510bcd91-09f0-4434-8143-cf0cc958ef70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.514149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" (UID: "85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578444 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578476 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578486 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578494 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578504 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578512 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/510bcd91-09f0-4434-8143-cf0cc958ef70-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578521 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87xfg\" (UniqueName: \"kubernetes.io/projected/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-kube-api-access-87xfg\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578529 4831 reconciler_common.go:293] "Volume detached for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578537 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqzr6\" (UniqueName: \"kubernetes.io/projected/510bcd91-09f0-4434-8143-cf0cc958ef70-kube-api-access-fqzr6\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578545 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510bcd91-09f0-4434-8143-cf0cc958ef70-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.578553 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.626869 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cfa737-89bb-4a99-be4f-5a5cbe39ecbb" path="/var/lib/kubelet/pods/42cfa737-89bb-4a99-be4f-5a5cbe39ecbb/volumes" Mar 09 16:19:33 crc kubenswrapper[4831]: I0309 16:19:33.627488 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" path="/var/lib/kubelet/pods/85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35/volumes" Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.136335 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84fxk" Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.136349 4831 scope.go:117] "RemoveContainer" containerID="3deec4a5e2cb7160e46648e85a8255963ee455c768c78c5cf164ade921f2c927" Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.138535 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" event={"ID":"510bcd91-09f0-4434-8143-cf0cc958ef70","Type":"ContainerDied","Data":"305c98ecfc352eff8f5a82b92a09cd5eb97e4454bea21e66c77bfdb3663f8d67"} Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.138569 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-8dqck" Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.158997 4831 scope.go:117] "RemoveContainer" containerID="95b53319f51590a36e3d3c8aee332a07d219737d9d16d46de4d80bc5c5e1383a" Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.194703 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-8dqck"] Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.197556 4831 scope.go:117] "RemoveContainer" containerID="acc09330be61f489114d47147c5bbcb3dccdaef387cf3b1e3f33dfca5c2a9433" Mar 09 16:19:34 crc kubenswrapper[4831]: I0309 16:19:34.201258 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-8dqck"] Mar 09 16:19:35 crc kubenswrapper[4831]: I0309 16:19:35.632105 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" path="/var/lib/kubelet/pods/510bcd91-09f0-4434-8143-cf0cc958ef70/volumes" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.140363 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551220-pmk86"] Mar 09 16:20:00 crc kubenswrapper[4831]: E0309 16:20:00.143057 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-server" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.143110 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-server" Mar 09 16:20:00 crc kubenswrapper[4831]: E0309 16:20:00.143163 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" containerName="swift-ring-rebalance" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.143172 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" containerName="swift-ring-rebalance" Mar 09 16:20:00 crc kubenswrapper[4831]: E0309 16:20:00.143188 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-httpd" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.143194 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-httpd" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.144211 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-server" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.144289 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ff6bf3-b5aa-44b3-ba52-17bcae4f1e35" containerName="swift-ring-rebalance" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.144329 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="510bcd91-09f0-4434-8143-cf0cc958ef70" containerName="proxy-httpd" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.145259 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551220-pmk86" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.146601 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551220-pmk86"] Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.147792 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.147992 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.148654 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.196267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsqrs\" (UniqueName: \"kubernetes.io/projected/fd860360-f0c0-4848-ac43-de53b118a65c-kube-api-access-hsqrs\") pod \"auto-csr-approver-29551220-pmk86\" (UID: \"fd860360-f0c0-4848-ac43-de53b118a65c\") " pod="openshift-infra/auto-csr-approver-29551220-pmk86" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.298166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsqrs\" (UniqueName: \"kubernetes.io/projected/fd860360-f0c0-4848-ac43-de53b118a65c-kube-api-access-hsqrs\") pod \"auto-csr-approver-29551220-pmk86\" (UID: \"fd860360-f0c0-4848-ac43-de53b118a65c\") " pod="openshift-infra/auto-csr-approver-29551220-pmk86" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.322045 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsqrs\" (UniqueName: \"kubernetes.io/projected/fd860360-f0c0-4848-ac43-de53b118a65c-kube-api-access-hsqrs\") pod \"auto-csr-approver-29551220-pmk86\" (UID: \"fd860360-f0c0-4848-ac43-de53b118a65c\") " 
pod="openshift-infra/auto-csr-approver-29551220-pmk86" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.481184 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551220-pmk86" Mar 09 16:20:00 crc kubenswrapper[4831]: I0309 16:20:00.947023 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551220-pmk86"] Mar 09 16:20:01 crc kubenswrapper[4831]: I0309 16:20:01.395505 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551220-pmk86" event={"ID":"fd860360-f0c0-4848-ac43-de53b118a65c","Type":"ContainerStarted","Data":"34944229a44ff090eba9446d43f6b9a3d545cdcf51741a949e4f50bbb2f20f41"} Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.408937 4831 generic.go:334] "Generic (PLEG): container finished" podID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerID="b37ef22ade6ee827494c771600b3c7d8a082745e08e2b7b25b4734e04198b671" exitCode=137 Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.409013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"b37ef22ade6ee827494c771600b3c7d8a082745e08e2b7b25b4734e04198b671"} Mar 09 16:20:02 crc kubenswrapper[4831]: E0309 16:20:02.673118 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7467fe3_df9b_419e_aff9_937d7ec2ebf9.slice/crio-6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod011210e3_7d65_4d5c_ac25_61aba095f4d3.slice/crio-conmon-d4b16cc296f82bbb0cc9fc486ae320532c25270e7ce85fc0109ed66a8fd15397.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7467fe3_df9b_419e_aff9_937d7ec2ebf9.slice/crio-conmon-6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b.scope\": RecentStats: unable to find data in memory cache]" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.718956 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.776591 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.836010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-lock\") pod \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.836578 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-etc-swift\") pod \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.836611 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.836648 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-cache\") pod \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 
16:20:02.836713 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8lj\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-kube-api-access-gs8lj\") pod \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\" (UID: \"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.836799 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-lock" (OuterVolumeSpecName: "lock") pod "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" (UID: "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.837136 4831 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-lock\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.837242 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-cache" (OuterVolumeSpecName: "cache") pod "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" (UID: "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.842454 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "swift") pod "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" (UID: "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.842487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" (UID: "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.842521 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-kube-api-access-gs8lj" (OuterVolumeSpecName: "kube-api-access-gs8lj") pod "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" (UID: "eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3"). InnerVolumeSpecName "kube-api-access-gs8lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.844503 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.938259 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-cache\") pod \"011210e3-7d65-4d5c-ac25-61aba095f4d3\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.938559 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"011210e3-7d65-4d5c-ac25-61aba095f4d3\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.938777 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-etc-swift\") pod \"011210e3-7d65-4d5c-ac25-61aba095f4d3\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.938966 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrp7\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-kube-api-access-mnrp7\") pod \"011210e3-7d65-4d5c-ac25-61aba095f4d3\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.938996 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-cache" (OuterVolumeSpecName: "cache") pod "011210e3-7d65-4d5c-ac25-61aba095f4d3" (UID: "011210e3-7d65-4d5c-ac25-61aba095f4d3"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.939266 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-lock\") pod \"011210e3-7d65-4d5c-ac25-61aba095f4d3\" (UID: \"011210e3-7d65-4d5c-ac25-61aba095f4d3\") " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.940003 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.940056 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.940072 4831 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-cache\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.940085 4831 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-cache\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.940100 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs8lj\" (UniqueName: \"kubernetes.io/projected/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3-kube-api-access-gs8lj\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.942314 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "011210e3-7d65-4d5c-ac25-61aba095f4d3" (UID: 
"011210e3-7d65-4d5c-ac25-61aba095f4d3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.942578 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-lock" (OuterVolumeSpecName: "lock") pod "011210e3-7d65-4d5c-ac25-61aba095f4d3" (UID: "011210e3-7d65-4d5c-ac25-61aba095f4d3"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.942634 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "swift") pod "011210e3-7d65-4d5c-ac25-61aba095f4d3" (UID: "011210e3-7d65-4d5c-ac25-61aba095f4d3"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.945576 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-kube-api-access-mnrp7" (OuterVolumeSpecName: "kube-api-access-mnrp7") pod "011210e3-7d65-4d5c-ac25-61aba095f4d3" (UID: "011210e3-7d65-4d5c-ac25-61aba095f4d3"). InnerVolumeSpecName "kube-api-access-mnrp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:02 crc kubenswrapper[4831]: I0309 16:20:02.969826 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.041104 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2hql\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-kube-api-access-c2hql\") pod \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.041164 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-lock\") pod \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.041214 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-cache\") pod \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.041948 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") pod \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.042003 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\" (UID: \"a7467fe3-df9b-419e-aff9-937d7ec2ebf9\") " Mar 09 16:20:03 
crc kubenswrapper[4831]: I0309 16:20:03.042013 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-lock" (OuterVolumeSpecName: "lock") pod "a7467fe3-df9b-419e-aff9-937d7ec2ebf9" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.042228 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-cache" (OuterVolumeSpecName: "cache") pod "a7467fe3-df9b-419e-aff9-937d7ec2ebf9" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.042433 4831 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/011210e3-7d65-4d5c-ac25-61aba095f4d3-lock\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.042519 4831 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-lock\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.042572 4831 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-cache\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.042632 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.042731 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.043451 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.043517 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrp7\" (UniqueName: \"kubernetes.io/projected/011210e3-7d65-4d5c-ac25-61aba095f4d3-kube-api-access-mnrp7\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.044456 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-kube-api-access-c2hql" (OuterVolumeSpecName: "kube-api-access-c2hql") pod "a7467fe3-df9b-419e-aff9-937d7ec2ebf9" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9"). InnerVolumeSpecName "kube-api-access-c2hql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.044808 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "a7467fe3-df9b-419e-aff9-937d7ec2ebf9" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.045336 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a7467fe3-df9b-419e-aff9-937d7ec2ebf9" (UID: "a7467fe3-df9b-419e-aff9-937d7ec2ebf9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.055052 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.144768 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.144815 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.144856 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.144871 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2hql\" (UniqueName: \"kubernetes.io/projected/a7467fe3-df9b-419e-aff9-937d7ec2ebf9-kube-api-access-c2hql\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.159916 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.246436 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.440004 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" 
containerID="6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b" exitCode=137 Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.440083 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.440746 4831 scope.go:117] "RemoveContainer" containerID="6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.440145 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.440980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"a7467fe3-df9b-419e-aff9-937d7ec2ebf9","Type":"ContainerDied","Data":"e03448a447762f0123c5d90d39b4b16747ac3c151b31b21ff6ade01c7c240d57"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.444492 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd860360-f0c0-4848-ac43-de53b118a65c" containerID="f47df54652f825009ca5aa8dd5b88e3f9142358f0ccc1d1b277907bc44deca6e" exitCode=0 Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.444547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551220-pmk86" event={"ID":"fd860360-f0c0-4848-ac43-de53b118a65c","Type":"ContainerDied","Data":"f47df54652f825009ca5aa8dd5b88e3f9142358f0ccc1d1b277907bc44deca6e"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.452322 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3","Type":"ContainerDied","Data":"26b325edb358f27de2b1d28ed076b7c4c28dec94796e61eb24975cb9a8ec4ecf"} Mar 09 16:20:03 crc kubenswrapper[4831]: 
I0309 16:20:03.452531 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464615 4831 generic.go:334] "Generic (PLEG): container finished" podID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerID="d4b16cc296f82bbb0cc9fc486ae320532c25270e7ce85fc0109ed66a8fd15397" exitCode=137 Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464657 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"d4b16cc296f82bbb0cc9fc486ae320532c25270e7ce85fc0109ed66a8fd15397"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464677 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa152cc22d247fbdc6f6fc8230b01b11951bc9789958a70e554b479c585250ba"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464700 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2716eca2b11afd50ad5194261604fbcabda8f5bbdae2efb5c70b34cf93551b1a"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464706 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58544f91bf4b5a496bac735b6044286f0a66dac13d4a8765f8bae4dbed0262c1"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464711 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e0368cf8ceca62972dff6780845e8bd90ee13e6d9c67d3f7b5d4bd90df92ab0"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464716 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad67a0f1b6ab71ed08d7709c8a6e23463ee08b49cff997baf927da95e79da787"} Mar 09 16:20:03 crc kubenswrapper[4831]: 
I0309 16:20:03.464721 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc14237174339607fded7afea1fec968a6b9994788cc6a30c285bcd751348893"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464727 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be18526ff3466cf3b46fa829866b1844cd86d6ef88e6d413e8243de352574be3"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464732 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"201b278f47db51f04e856b419acfcebebc6a7e5aef83909afce3d9b22984e58b"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464737 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64f942174429c52e8fe3b9d9614ce3d1f7eaf53128b3932dbf260c41466ddcc5"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"011210e3-7d65-4d5c-ac25-61aba095f4d3","Type":"ContainerDied","Data":"0c4390ab3e217983fc4a9b7d4c402339447b0fb7f06be8e8e589ce38c6370dd0"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464756 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4b16cc296f82bbb0cc9fc486ae320532c25270e7ce85fc0109ed66a8fd15397"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464761 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7477ffb26b774c31b2036c09a4ca36d0999c7dfce001daa275b9f92dc38d58eb"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464766 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d98db244fd35d03d7a324c841a7e5cf6ae32bdd81eea4970384326b638379076"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464771 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c18f77730b0eacad76c1cf8638cccd61eb09ddc280de87a12beb2cfb4357177"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464775 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489103e6600750cab6dc07084d6af8aba0b320f8c4c1567aeae5d7b33791fe6"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464780 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"608fce74f9786ecf9834b623f3635243be5af8d3bd83d99fb177b2d5ab2bad9c"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464786 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa152cc22d247fbdc6f6fc8230b01b11951bc9789958a70e554b479c585250ba"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464791 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2716eca2b11afd50ad5194261604fbcabda8f5bbdae2efb5c70b34cf93551b1a"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464797 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58544f91bf4b5a496bac735b6044286f0a66dac13d4a8765f8bae4dbed0262c1"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464803 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e0368cf8ceca62972dff6780845e8bd90ee13e6d9c67d3f7b5d4bd90df92ab0"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464815 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ad67a0f1b6ab71ed08d7709c8a6e23463ee08b49cff997baf927da95e79da787"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464823 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc14237174339607fded7afea1fec968a6b9994788cc6a30c285bcd751348893"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464830 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be18526ff3466cf3b46fa829866b1844cd86d6ef88e6d413e8243de352574be3"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464838 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"201b278f47db51f04e856b419acfcebebc6a7e5aef83909afce3d9b22984e58b"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464845 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64f942174429c52e8fe3b9d9614ce3d1f7eaf53128b3932dbf260c41466ddcc5"} Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.464968 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.477039 4831 scope.go:117] "RemoveContainer" containerID="5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.501638 4831 scope.go:117] "RemoveContainer" containerID="6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.502844 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.513895 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.521079 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.533343 4831 scope.go:117] "RemoveContainer" containerID="79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.534016 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.541576 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.548232 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.550667 4831 scope.go:117] "RemoveContainer" containerID="07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.564973 4831 scope.go:117] "RemoveContainer" containerID="19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.579488 4831 scope.go:117] 
"RemoveContainer" containerID="c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.594884 4831 scope.go:117] "RemoveContainer" containerID="bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.625197 4831 scope.go:117] "RemoveContainer" containerID="ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.629915 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" path="/var/lib/kubelet/pods/011210e3-7d65-4d5c-ac25-61aba095f4d3/volumes" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.631618 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" path="/var/lib/kubelet/pods/a7467fe3-df9b-419e-aff9-937d7ec2ebf9/volumes" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.633545 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" path="/var/lib/kubelet/pods/eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3/volumes" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.646157 4831 scope.go:117] "RemoveContainer" containerID="7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.670162 4831 scope.go:117] "RemoveContainer" containerID="08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.689208 4831 scope.go:117] "RemoveContainer" containerID="07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.702433 4831 scope.go:117] "RemoveContainer" containerID="dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.714370 4831 
scope.go:117] "RemoveContainer" containerID="9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.730653 4831 scope.go:117] "RemoveContainer" containerID="efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.743747 4831 scope.go:117] "RemoveContainer" containerID="6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.744105 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b\": container with ID starting with 6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b not found: ID does not exist" containerID="6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744142 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b"} err="failed to get container status \"6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b\": rpc error: code = NotFound desc = could not find container \"6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b\": container with ID starting with 6a8c3d453e673803c6bff897a597a30ee4ccbf8d0261a6ca1404dd690251807b not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744164 4831 scope.go:117] "RemoveContainer" containerID="5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.744438 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df\": container with ID starting with 
5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df not found: ID does not exist" containerID="5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744482 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df"} err="failed to get container status \"5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df\": rpc error: code = NotFound desc = could not find container \"5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df\": container with ID starting with 5c4765f9b0eb8359988b11e3806bb3b8b489e5a5d4ac9e2e94ab6a10c21cb4df not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744513 4831 scope.go:117] "RemoveContainer" containerID="6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.744774 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370\": container with ID starting with 6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370 not found: ID does not exist" containerID="6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744803 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370"} err="failed to get container status \"6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370\": rpc error: code = NotFound desc = could not find container \"6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370\": container with ID starting with 6c72c4f5b1669b5670369d29e546b1be5c7d19b48005177dd7ef383f952f5370 not found: ID does not 
exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744816 4831 scope.go:117] "RemoveContainer" containerID="79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.744955 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5\": container with ID starting with 79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5 not found: ID does not exist" containerID="79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744975 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5"} err="failed to get container status \"79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5\": rpc error: code = NotFound desc = could not find container \"79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5\": container with ID starting with 79cd55734b64e0277a5ebcafa58a5e68d3ba195c7e89d64e649eb51cdfa043b5 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.744989 4831 scope.go:117] "RemoveContainer" containerID="07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.745296 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706\": container with ID starting with 07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706 not found: ID does not exist" containerID="07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.745335 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706"} err="failed to get container status \"07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706\": rpc error: code = NotFound desc = could not find container \"07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706\": container with ID starting with 07a30ef33453b7d4f35cf10cf77bc22043f0bcd7bc04f337ad461b80d9cd9706 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.745357 4831 scope.go:117] "RemoveContainer" containerID="19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.745585 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e\": container with ID starting with 19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e not found: ID does not exist" containerID="19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.745608 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e"} err="failed to get container status \"19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e\": rpc error: code = NotFound desc = could not find container \"19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e\": container with ID starting with 19e35dcbd83f84afac78d5f7a6d199974ec3d562a41bc88968cb22dcb762f92e not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.745620 4831 scope.go:117] "RemoveContainer" containerID="c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.746024 4831 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f\": container with ID starting with c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f not found: ID does not exist" containerID="c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746055 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f"} err="failed to get container status \"c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f\": rpc error: code = NotFound desc = could not find container \"c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f\": container with ID starting with c3cb977789b062b6e241c1b72086a5770237ff75c61f423f92a02cd78549e74f not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746073 4831 scope.go:117] "RemoveContainer" containerID="bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.746306 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3\": container with ID starting with bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3 not found: ID does not exist" containerID="bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746328 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3"} err="failed to get container status \"bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3\": rpc error: code = NotFound desc = could 
not find container \"bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3\": container with ID starting with bcec11f5b951f5e512a0040a54496444a9bd0876f1ee763feef4a295ce4bfca3 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746342 4831 scope.go:117] "RemoveContainer" containerID="ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.746601 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862\": container with ID starting with ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862 not found: ID does not exist" containerID="ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746631 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862"} err="failed to get container status \"ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862\": rpc error: code = NotFound desc = could not find container \"ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862\": container with ID starting with ae1abf22071d68fd2cf9ed23ac2e9fb5d0e866312ffd2c44b3b3126b96b81862 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746650 4831 scope.go:117] "RemoveContainer" containerID="7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.746904 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4\": container with ID starting with 7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4 not found: 
ID does not exist" containerID="7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746926 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4"} err="failed to get container status \"7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4\": rpc error: code = NotFound desc = could not find container \"7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4\": container with ID starting with 7ae9616dfa605f7f7d9705349539dc14256ef54d0cdc2672c3b5680d9e0274f4 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.746938 4831 scope.go:117] "RemoveContainer" containerID="08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.747215 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459\": container with ID starting with 08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459 not found: ID does not exist" containerID="08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.747236 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459"} err="failed to get container status \"08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459\": rpc error: code = NotFound desc = could not find container \"08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459\": container with ID starting with 08ff26d64053c8f5cc1dfc607316d2ffe75902931fda2880fe76603f16440459 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.747247 4831 
scope.go:117] "RemoveContainer" containerID="07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.747494 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502\": container with ID starting with 07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502 not found: ID does not exist" containerID="07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.747537 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502"} err="failed to get container status \"07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502\": rpc error: code = NotFound desc = could not find container \"07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502\": container with ID starting with 07990ff9dce6b67fe84135f0fb355ad3fbef0ab54003b2633c05900c1010c502 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.747556 4831 scope.go:117] "RemoveContainer" containerID="dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.747866 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91\": container with ID starting with dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91 not found: ID does not exist" containerID="dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.747885 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91"} err="failed to get container status \"dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91\": rpc error: code = NotFound desc = could not find container \"dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91\": container with ID starting with dfe68baa4132d663e21cf302e29824812cbf394c16637d0445bd1336baca5d91 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.747897 4831 scope.go:117] "RemoveContainer" containerID="9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.748124 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11\": container with ID starting with 9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11 not found: ID does not exist" containerID="9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.748152 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11"} err="failed to get container status \"9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11\": rpc error: code = NotFound desc = could not find container \"9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11\": container with ID starting with 9e1bb6ffbdb901b8ef8a4c6e0397b8f5636319438687999d57414259b92daf11 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.748168 4831 scope.go:117] "RemoveContainer" containerID="efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850" Mar 09 16:20:03 crc kubenswrapper[4831]: E0309 16:20:03.748446 4831 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850\": container with ID starting with efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850 not found: ID does not exist" containerID="efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.748466 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850"} err="failed to get container status \"efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850\": rpc error: code = NotFound desc = could not find container \"efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850\": container with ID starting with efa891f846e6ccd9b090b0c5b2efc37135716ba531930516c3924636d1b25850 not found: ID does not exist" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.748483 4831 scope.go:117] "RemoveContainer" containerID="b37ef22ade6ee827494c771600b3c7d8a082745e08e2b7b25b4734e04198b671" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.764117 4831 scope.go:117] "RemoveContainer" containerID="33019fcb1c761c34c0175a7c3ca2117559b6635fbee761fca05f81c92302de1a" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.787712 4831 scope.go:117] "RemoveContainer" containerID="5123b7aed8b1aa990862ebb7b7820a0067a73b9e62dc2e624e345898276a24ab" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.809169 4831 scope.go:117] "RemoveContainer" containerID="4a8413055e381b938750f7aecdc3701f85696c224b3cbea18fdf9d9405d5395e" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.828764 4831 scope.go:117] "RemoveContainer" containerID="bc7895791517c37c9aaa4594a12f31677b4d576d5ca9fa256c30176fad0d0604" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.851869 4831 scope.go:117] "RemoveContainer" 
containerID="508790ce92b23143997d6bf5ccf4c115640f9b605d49faee20edaafae8943978" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.872616 4831 scope.go:117] "RemoveContainer" containerID="b6343f4e4d579be2e95f1898c3b5ae303d530d42b77554ccc010ef5b0217e8ea" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.892266 4831 scope.go:117] "RemoveContainer" containerID="976f2ba6734acbd0ddec6ec6f925ae9962b5f2c1044b1d2b888da945054018ab" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.912812 4831 scope.go:117] "RemoveContainer" containerID="f17aefe7f1d3ca05dd1cfa58cb89640748c7cfcb2c2608111e4658175fdfe99d" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.935947 4831 scope.go:117] "RemoveContainer" containerID="63f6dec4ae171d3c4b3f29d7bf66cf1f1062e056f802fe2ed723715a55fe72ce" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.958464 4831 scope.go:117] "RemoveContainer" containerID="039b998d39aa2b9d3a92d268f6fcba343f7b76335126d2f8768f6d16e113cb2b" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.975621 4831 scope.go:117] "RemoveContainer" containerID="b9501d9e0a7820be98728618556d59404f9f3ed3f2c9b6d97f9700bb230ad537" Mar 09 16:20:03 crc kubenswrapper[4831]: I0309 16:20:03.992061 4831 scope.go:117] "RemoveContainer" containerID="8f3635741313b820c5ef3176bb9327666308bdef54b72d8d9e296324d907313b" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.010857 4831 scope.go:117] "RemoveContainer" containerID="2d9feedd7999580ce35529f36a37efa59abacaea45888b2f87a718645a627bde" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.029204 4831 scope.go:117] "RemoveContainer" containerID="a83dd135f4488ccdf12adcfdee48235d2b572891b0dffe8c14cd9b9979f44924" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.046316 4831 scope.go:117] "RemoveContainer" containerID="d4b16cc296f82bbb0cc9fc486ae320532c25270e7ce85fc0109ed66a8fd15397" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.063937 4831 scope.go:117] "RemoveContainer" 
containerID="7477ffb26b774c31b2036c09a4ca36d0999c7dfce001daa275b9f92dc38d58eb" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.081793 4831 scope.go:117] "RemoveContainer" containerID="d98db244fd35d03d7a324c841a7e5cf6ae32bdd81eea4970384326b638379076" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.097990 4831 scope.go:117] "RemoveContainer" containerID="2c18f77730b0eacad76c1cf8638cccd61eb09ddc280de87a12beb2cfb4357177" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.118260 4831 scope.go:117] "RemoveContainer" containerID="e489103e6600750cab6dc07084d6af8aba0b320f8c4c1567aeae5d7b33791fe6" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.140141 4831 scope.go:117] "RemoveContainer" containerID="608fce74f9786ecf9834b623f3635243be5af8d3bd83d99fb177b2d5ab2bad9c" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.746217 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551220-pmk86" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.866043 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsqrs\" (UniqueName: \"kubernetes.io/projected/fd860360-f0c0-4848-ac43-de53b118a65c-kube-api-access-hsqrs\") pod \"fd860360-f0c0-4848-ac43-de53b118a65c\" (UID: \"fd860360-f0c0-4848-ac43-de53b118a65c\") " Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.871332 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd860360-f0c0-4848-ac43-de53b118a65c-kube-api-access-hsqrs" (OuterVolumeSpecName: "kube-api-access-hsqrs") pod "fd860360-f0c0-4848-ac43-de53b118a65c" (UID: "fd860360-f0c0-4848-ac43-de53b118a65c"). InnerVolumeSpecName "kube-api-access-hsqrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:04 crc kubenswrapper[4831]: I0309 16:20:04.967038 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsqrs\" (UniqueName: \"kubernetes.io/projected/fd860360-f0c0-4848-ac43-de53b118a65c-kube-api-access-hsqrs\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:05 crc kubenswrapper[4831]: I0309 16:20:05.487454 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551220-pmk86" event={"ID":"fd860360-f0c0-4848-ac43-de53b118a65c","Type":"ContainerDied","Data":"34944229a44ff090eba9446d43f6b9a3d545cdcf51741a949e4f50bbb2f20f41"} Mar 09 16:20:05 crc kubenswrapper[4831]: I0309 16:20:05.487513 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34944229a44ff090eba9446d43f6b9a3d545cdcf51741a949e4f50bbb2f20f41" Mar 09 16:20:05 crc kubenswrapper[4831]: I0309 16:20:05.487531 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551220-pmk86" Mar 09 16:20:05 crc kubenswrapper[4831]: I0309 16:20:05.810667 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551214-zmwwz"] Mar 09 16:20:05 crc kubenswrapper[4831]: I0309 16:20:05.821715 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551214-zmwwz"] Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.462830 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463075 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463086 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-server" Mar 09 16:20:06 crc 
kubenswrapper[4831]: E0309 16:20:06.463096 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463102 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463112 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463119 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463127 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd860360-f0c0-4848-ac43-de53b118a65c" containerName="oc" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463133 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd860360-f0c0-4848-ac43-de53b118a65c" containerName="oc" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463140 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463145 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463153 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463158 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: 
E0309 16:20:06.463168 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463174 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463189 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463195 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463201 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463207 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463218 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463223 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463230 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463237 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: 
E0309 16:20:06.463245 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463251 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463258 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463263 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463272 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463277 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463285 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463290 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463299 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463305 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 
16:20:06.463312 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463317 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463324 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463329 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463338 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463344 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463446 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463454 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463462 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463468 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463477 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463483 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463492 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463497 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463507 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463513 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463520 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463526 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463534 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463540 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463560 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463565 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463574 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463579 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463587 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463593 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463601 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463607 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463615 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463620 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463628 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463634 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463644 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463650 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463659 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463665 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463673 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463679 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463685 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463691 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463701 4831 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463707 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463714 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463720 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463728 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463735 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463746 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463754 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463766 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463772 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463783 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463789 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463798 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463804 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463813 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463819 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463828 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463833 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.463841 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463847 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463958 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463972 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463984 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463991 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd860360-f0c0-4848-ac43-de53b118a65c" containerName="oc" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.463997 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464005 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464016 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464023 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464030 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464039 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464045 4831 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464053 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464060 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464068 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464075 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464082 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464089 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464097 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464105 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464115 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464124 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464129 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464135 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464144 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464152 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464159 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="rsync" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464166 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464174 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464181 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464187 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-updater" Mar 09 16:20:06 crc 
kubenswrapper[4831]: I0309 16:20:06.464193 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464202 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464209 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464216 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464225 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464232 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="swift-recon-cron" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464238 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464247 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="account-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464255 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464261 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" 
containerName="container-auditor" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464270 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="account-reaper" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464278 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1c9c4-a793-47fc-8ccd-ba4fefa866a3" containerName="container-replicator" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464285 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464292 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="container-server" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464298 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="011210e3-7d65-4d5c-ac25-61aba095f4d3" containerName="object-expirer" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.464306 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7467fe3-df9b-419e-aff9-937d7ec2ebf9" containerName="object-updater" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.468345 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.474315 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.474504 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.475314 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-l6488" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.475986 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.496770 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.589613 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-lock\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.589674 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.589703 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7r6\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-kube-api-access-bn7r6\") pod \"swift-storage-0\" (UID: 
\"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.589744 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.589793 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-cache\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.690663 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.690718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7r6\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-kube-api-access-bn7r6\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.690758 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.690808 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-cache\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.690831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-lock\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.691227 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.691323 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.691264 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-lock\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.691232 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") device mount path \"/mnt/openstack/pv01\"" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.691524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-cache\") 
pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: E0309 16:20:06.691629 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift podName:11a56635-2024-4f89-98be-207e7a1176fe nodeName:}" failed. No retries permitted until 2026-03-09 16:20:07.191467484 +0000 UTC m=+1334.325149907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift") pod "swift-storage-0" (UID: "11a56635-2024-4f89-98be-207e7a1176fe") : configmap "swift-ring-files" not found Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.716065 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7r6\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-kube-api-access-bn7r6\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.716773 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.856129 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pnsn2"] Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.857322 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.860349 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.860659 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.861514 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.880950 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pnsn2"] Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.995453 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-scripts\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.995532 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-dispersionconf\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.995711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-swiftconf\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:06 crc 
kubenswrapper[4831]: I0309 16:20:06.995968 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ab982c0-f334-4924-beeb-912d170378d5-etc-swift\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.996300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-ring-data-devices\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:06 crc kubenswrapper[4831]: I0309 16:20:06.996375 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx77w\" (UniqueName: \"kubernetes.io/projected/1ab982c0-f334-4924-beeb-912d170378d5-kube-api-access-wx77w\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.098150 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-ring-data-devices\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.098242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx77w\" (UniqueName: \"kubernetes.io/projected/1ab982c0-f334-4924-beeb-912d170378d5-kube-api-access-wx77w\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.098296 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-scripts\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.098335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-dispersionconf\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.098371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-swiftconf\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.098475 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ab982c0-f334-4924-beeb-912d170378d5-etc-swift\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.099179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ab982c0-f334-4924-beeb-912d170378d5-etc-swift\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: 
I0309 16:20:07.099194 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-ring-data-devices\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.099872 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-scripts\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.101982 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-dispersionconf\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.104277 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-swiftconf\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.126255 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx77w\" (UniqueName: \"kubernetes.io/projected/1ab982c0-f334-4924-beeb-912d170378d5-kube-api-access-wx77w\") pod \"swift-ring-rebalance-pnsn2\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.179843 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.199553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:07 crc kubenswrapper[4831]: E0309 16:20:07.199780 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:20:07 crc kubenswrapper[4831]: E0309 16:20:07.199843 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:20:07 crc kubenswrapper[4831]: E0309 16:20:07.199931 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift podName:11a56635-2024-4f89-98be-207e7a1176fe nodeName:}" failed. No retries permitted until 2026-03-09 16:20:08.199904994 +0000 UTC m=+1335.333587467 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift") pod "swift-storage-0" (UID: "11a56635-2024-4f89-98be-207e7a1176fe") : configmap "swift-ring-files" not found Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.633014 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7153cf3a-f458-4a17-b759-85d90d63d60a" path="/var/lib/kubelet/pods/7153cf3a-f458-4a17-b759-85d90d63d60a/volumes" Mar 09 16:20:07 crc kubenswrapper[4831]: I0309 16:20:07.646262 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pnsn2"] Mar 09 16:20:07 crc kubenswrapper[4831]: W0309 16:20:07.649236 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab982c0_f334_4924_beeb_912d170378d5.slice/crio-859fe22936d59fc0e03a328584bf8d240ab3d84cb4f3e6e166f52a669b99de48 WatchSource:0}: Error finding container 859fe22936d59fc0e03a328584bf8d240ab3d84cb4f3e6e166f52a669b99de48: Status 404 returned error can't find the container with id 859fe22936d59fc0e03a328584bf8d240ab3d84cb4f3e6e166f52a669b99de48 Mar 09 16:20:08 crc kubenswrapper[4831]: I0309 16:20:08.213285 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:08 crc kubenswrapper[4831]: E0309 16:20:08.213632 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:20:08 crc kubenswrapper[4831]: E0309 16:20:08.213671 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:20:08 crc 
kubenswrapper[4831]: E0309 16:20:08.213790 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift podName:11a56635-2024-4f89-98be-207e7a1176fe nodeName:}" failed. No retries permitted until 2026-03-09 16:20:10.213754687 +0000 UTC m=+1337.347437170 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift") pod "swift-storage-0" (UID: "11a56635-2024-4f89-98be-207e7a1176fe") : configmap "swift-ring-files" not found Mar 09 16:20:08 crc kubenswrapper[4831]: I0309 16:20:08.532337 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" event={"ID":"1ab982c0-f334-4924-beeb-912d170378d5","Type":"ContainerStarted","Data":"41d421601f49d2f213d5cefdb7ce9ad6982cc9eb06b4103d3fc4f2aac1be507a"} Mar 09 16:20:08 crc kubenswrapper[4831]: I0309 16:20:08.532432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" event={"ID":"1ab982c0-f334-4924-beeb-912d170378d5","Type":"ContainerStarted","Data":"859fe22936d59fc0e03a328584bf8d240ab3d84cb4f3e6e166f52a669b99de48"} Mar 09 16:20:08 crc kubenswrapper[4831]: I0309 16:20:08.559867 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" podStartSLOduration=2.559844793 podStartE2EDuration="2.559844793s" podCreationTimestamp="2026-03-09 16:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:20:08.554238382 +0000 UTC m=+1335.687920805" watchObservedRunningTime="2026-03-09 16:20:08.559844793 +0000 UTC m=+1335.693527226" Mar 09 16:20:10 crc kubenswrapper[4831]: I0309 16:20:10.248697 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:10 crc kubenswrapper[4831]: E0309 16:20:10.248919 4831 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 16:20:10 crc kubenswrapper[4831]: E0309 16:20:10.249237 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 16:20:10 crc kubenswrapper[4831]: E0309 16:20:10.249303 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift podName:11a56635-2024-4f89-98be-207e7a1176fe nodeName:}" failed. No retries permitted until 2026-03-09 16:20:14.249285145 +0000 UTC m=+1341.382967568 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift") pod "swift-storage-0" (UID: "11a56635-2024-4f89-98be-207e7a1176fe") : configmap "swift-ring-files" not found Mar 09 16:20:14 crc kubenswrapper[4831]: I0309 16:20:14.310591 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:14 crc kubenswrapper[4831]: I0309 16:20:14.333876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"swift-storage-0\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:14 crc kubenswrapper[4831]: I0309 16:20:14.586508 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:20:14 crc kubenswrapper[4831]: I0309 16:20:14.587985 4831 generic.go:334] "Generic (PLEG): container finished" podID="1ab982c0-f334-4924-beeb-912d170378d5" containerID="41d421601f49d2f213d5cefdb7ce9ad6982cc9eb06b4103d3fc4f2aac1be507a" exitCode=0 Mar 09 16:20:14 crc kubenswrapper[4831]: I0309 16:20:14.588059 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" event={"ID":"1ab982c0-f334-4924-beeb-912d170378d5","Type":"ContainerDied","Data":"41d421601f49d2f213d5cefdb7ce9ad6982cc9eb06b4103d3fc4f2aac1be507a"} Mar 09 16:20:14 crc kubenswrapper[4831]: I0309 16:20:14.856247 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:20:15 crc kubenswrapper[4831]: I0309 16:20:15.597107 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"bd11ab7de11a331b0134fa2038dd2fea81edc6e95ade077be08319319dd86742"} Mar 09 16:20:15 crc kubenswrapper[4831]: I0309 16:20:15.597518 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"ba5071b51b0c0f395d04955d5cc3b833d94fb61752ff6aa3c2a98edc18ec75ea"} Mar 09 16:20:15 crc kubenswrapper[4831]: I0309 16:20:15.597534 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"52c06dba774928e5339332493c207e503eee09c0c77395cab009b68d9dc446c8"} Mar 09 16:20:15 crc kubenswrapper[4831]: I0309 16:20:15.597545 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"0d0615e21966787026795ed21aec5a8b9150ef6e98421a2dea83c8969206ad36"} Mar 09 16:20:15 crc kubenswrapper[4831]: I0309 16:20:15.597556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"c1cb9eff57a84c22211bbd32b81d7e8e1519c4ddccd6e902928bb74c734e4398"} Mar 09 16:20:15 crc kubenswrapper[4831]: I0309 16:20:15.974524 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.147748 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-swiftconf\") pod \"1ab982c0-f334-4924-beeb-912d170378d5\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.148029 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-scripts\") pod \"1ab982c0-f334-4924-beeb-912d170378d5\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.148072 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx77w\" (UniqueName: \"kubernetes.io/projected/1ab982c0-f334-4924-beeb-912d170378d5-kube-api-access-wx77w\") pod \"1ab982c0-f334-4924-beeb-912d170378d5\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.148105 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ab982c0-f334-4924-beeb-912d170378d5-etc-swift\") pod 
\"1ab982c0-f334-4924-beeb-912d170378d5\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.148144 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-dispersionconf\") pod \"1ab982c0-f334-4924-beeb-912d170378d5\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.148203 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-ring-data-devices\") pod \"1ab982c0-f334-4924-beeb-912d170378d5\" (UID: \"1ab982c0-f334-4924-beeb-912d170378d5\") " Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.149058 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1ab982c0-f334-4924-beeb-912d170378d5" (UID: "1ab982c0-f334-4924-beeb-912d170378d5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.149196 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ab982c0-f334-4924-beeb-912d170378d5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1ab982c0-f334-4924-beeb-912d170378d5" (UID: "1ab982c0-f334-4924-beeb-912d170378d5"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.153564 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab982c0-f334-4924-beeb-912d170378d5-kube-api-access-wx77w" (OuterVolumeSpecName: "kube-api-access-wx77w") pod "1ab982c0-f334-4924-beeb-912d170378d5" (UID: "1ab982c0-f334-4924-beeb-912d170378d5"). InnerVolumeSpecName "kube-api-access-wx77w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.159682 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1ab982c0-f334-4924-beeb-912d170378d5" (UID: "1ab982c0-f334-4924-beeb-912d170378d5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.177672 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1ab982c0-f334-4924-beeb-912d170378d5" (UID: "1ab982c0-f334-4924-beeb-912d170378d5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.179082 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-scripts" (OuterVolumeSpecName: "scripts") pod "1ab982c0-f334-4924-beeb-912d170378d5" (UID: "1ab982c0-f334-4924-beeb-912d170378d5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.249411 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx77w\" (UniqueName: \"kubernetes.io/projected/1ab982c0-f334-4924-beeb-912d170378d5-kube-api-access-wx77w\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.249442 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1ab982c0-f334-4924-beeb-912d170378d5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.249451 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.249460 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.249469 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1ab982c0-f334-4924-beeb-912d170378d5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.249477 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab982c0-f334-4924-beeb-912d170378d5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.607050 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"94847ce5f0acec892e5cd89129e447e396a43060107ba95f8c9b139186d9eb0c"} Mar 09 16:20:16 crc kubenswrapper[4831]: 
I0309 16:20:16.607089 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"aaf90ce0ab9bffc7318f115d84669f3df2de828d8b5c3f705b46a2267c62ae7c"} Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.607099 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"0c7e5174d9f672056afce49e92e06209ed7d7d94d12faa9878b386f540aae91c"} Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.607108 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"4e1ddecb424646252918e3a4fadf6abbc887ee68bff6e3eb9bd5e9554affb3ee"} Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.607131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"28154e98042b7c18f39a657cd1e47647018cb477b9c733e3c3a7200e09660b22"} Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.607139 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"e90d8f1abbb3b1a24dde3c4a5692e4e3558f9fe6f7c6d524d7d4648442f6669f"} Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.607148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"08dc9956f0db87fd55c19fef4d06f06377b555bf9915bea9307ac658a572a474"} Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.608768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" 
event={"ID":"1ab982c0-f334-4924-beeb-912d170378d5","Type":"ContainerDied","Data":"859fe22936d59fc0e03a328584bf8d240ab3d84cb4f3e6e166f52a669b99de48"} Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.608790 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="859fe22936d59fc0e03a328584bf8d240ab3d84cb4f3e6e166f52a669b99de48" Mar 09 16:20:16 crc kubenswrapper[4831]: I0309 16:20:16.608834 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pnsn2" Mar 09 16:20:17 crc kubenswrapper[4831]: I0309 16:20:17.624329 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"03ce249750b91f5a4458b3f2fcebe229ac19f428bff35a9577f87ce0c686fcb8"} Mar 09 16:20:17 crc kubenswrapper[4831]: I0309 16:20:17.624627 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"cd92eef9b48e2fbc409ddacc00af540edf1d62d35e807b6d8fbd4de17e75fbcf"} Mar 09 16:20:17 crc kubenswrapper[4831]: I0309 16:20:17.624636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"5bb292b424172609a8fdd92d252e1a716d5502e9e585df3b898f566dc4a99933"} Mar 09 16:20:17 crc kubenswrapper[4831]: I0309 16:20:17.624645 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"ff7554df134a9b987aee98588a94d5b4291b3007d236dab11ef4a5269c5ec155"} Mar 09 16:20:17 crc kubenswrapper[4831]: I0309 16:20:17.624653 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerStarted","Data":"1b1b7ad9b3b25fc4186459e5b20227c37302368c4c8864f045098981360f9f7d"} Mar 09 16:20:17 crc kubenswrapper[4831]: I0309 16:20:17.669508 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=12.669490848 podStartE2EDuration="12.669490848s" podCreationTimestamp="2026-03-09 16:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:20:17.658518454 +0000 UTC m=+1344.792200877" watchObservedRunningTime="2026-03-09 16:20:17.669490848 +0000 UTC m=+1344.803173271" Mar 09 16:20:35 crc kubenswrapper[4831]: I0309 16:20:35.992673 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw"] Mar 09 16:20:35 crc kubenswrapper[4831]: E0309 16:20:35.995119 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab982c0-f334-4924-beeb-912d170378d5" containerName="swift-ring-rebalance" Mar 09 16:20:35 crc kubenswrapper[4831]: I0309 16:20:35.995251 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab982c0-f334-4924-beeb-912d170378d5" containerName="swift-ring-rebalance" Mar 09 16:20:35 crc kubenswrapper[4831]: I0309 16:20:35.995763 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab982c0-f334-4924-beeb-912d170378d5" containerName="swift-ring-rebalance" Mar 09 16:20:35 crc kubenswrapper[4831]: I0309 16:20:35.997462 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.003691 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw"] Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.017431 4831 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.142455 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-log-httpd\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.142566 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591a000d-158e-4105-98c1-0fd75c2aa00c-config-data\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.142780 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm8ww\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-kube-api-access-nm8ww\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.142811 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-etc-swift\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: 
\"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.142903 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-run-httpd\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.244751 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-log-httpd\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.245175 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591a000d-158e-4105-98c1-0fd75c2aa00c-config-data\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.245446 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm8ww\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-kube-api-access-nm8ww\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.245541 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-log-httpd\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: 
\"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.245652 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-etc-swift\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.246001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-run-httpd\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.246378 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-run-httpd\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.251878 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591a000d-158e-4105-98c1-0fd75c2aa00c-config-data\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.264577 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-etc-swift\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " 
pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.284018 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm8ww\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-kube-api-access-nm8ww\") pod \"swift-proxy-67d5466c69-m5wnw\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.330015 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:36 crc kubenswrapper[4831]: I0309 16:20:36.789543 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw"] Mar 09 16:20:36 crc kubenswrapper[4831]: W0309 16:20:36.805782 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod591a000d_158e_4105_98c1_0fd75c2aa00c.slice/crio-4f53de33b2091a2146fbff123e2a458b36f61bef7967d6dbc7d166fc95110721 WatchSource:0}: Error finding container 4f53de33b2091a2146fbff123e2a458b36f61bef7967d6dbc7d166fc95110721: Status 404 returned error can't find the container with id 4f53de33b2091a2146fbff123e2a458b36f61bef7967d6dbc7d166fc95110721 Mar 09 16:20:37 crc kubenswrapper[4831]: I0309 16:20:37.779094 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" event={"ID":"591a000d-158e-4105-98c1-0fd75c2aa00c","Type":"ContainerStarted","Data":"1ef24f3552ff990790e620c5aad05a2418bc9b14dadf3dcb82fbade7475fb162"} Mar 09 16:20:37 crc kubenswrapper[4831]: I0309 16:20:37.779153 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" 
event={"ID":"591a000d-158e-4105-98c1-0fd75c2aa00c","Type":"ContainerStarted","Data":"69e4cf1ca8fab537dfb3682664289a30d2b10b29537902236c2f2bd6d37b68cc"} Mar 09 16:20:37 crc kubenswrapper[4831]: I0309 16:20:37.779171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" event={"ID":"591a000d-158e-4105-98c1-0fd75c2aa00c","Type":"ContainerStarted","Data":"4f53de33b2091a2146fbff123e2a458b36f61bef7967d6dbc7d166fc95110721"} Mar 09 16:20:37 crc kubenswrapper[4831]: I0309 16:20:37.779275 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:37 crc kubenswrapper[4831]: I0309 16:20:37.803370 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" podStartSLOduration=2.803351833 podStartE2EDuration="2.803351833s" podCreationTimestamp="2026-03-09 16:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:20:37.803351713 +0000 UTC m=+1364.937034146" watchObservedRunningTime="2026-03-09 16:20:37.803351833 +0000 UTC m=+1364.937034256" Mar 09 16:20:38 crc kubenswrapper[4831]: I0309 16:20:38.785729 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:41 crc kubenswrapper[4831]: I0309 16:20:41.334668 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:46 crc kubenswrapper[4831]: I0309 16:20:46.332449 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.836305 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-46fwx"] Mar 09 16:20:47 crc 
kubenswrapper[4831]: I0309 16:20:47.837592 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.841973 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.842484 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.856514 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-46fwx"] Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.924936 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-swiftconf\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.924972 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-scripts\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.925011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-dispersionconf\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 
16:20:47.925035 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-ring-data-devices\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.925051 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/143ce599-e585-4f1d-bb4d-ca9f0e831217-etc-swift\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:47 crc kubenswrapper[4831]: I0309 16:20:47.925069 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpfrt\" (UniqueName: \"kubernetes.io/projected/143ce599-e585-4f1d-bb4d-ca9f0e831217-kube-api-access-wpfrt\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.026544 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-swiftconf\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.026601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-scripts\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.026657 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-dispersionconf\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.026691 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-ring-data-devices\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.026739 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/143ce599-e585-4f1d-bb4d-ca9f0e831217-etc-swift\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.026779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpfrt\" (UniqueName: \"kubernetes.io/projected/143ce599-e585-4f1d-bb4d-ca9f0e831217-kube-api-access-wpfrt\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.027425 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/143ce599-e585-4f1d-bb4d-ca9f0e831217-etc-swift\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: 
\"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.027820 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-ring-data-devices\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.027895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-scripts\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.033116 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-swiftconf\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.039077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-dispersionconf\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.043361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpfrt\" (UniqueName: \"kubernetes.io/projected/143ce599-e585-4f1d-bb4d-ca9f0e831217-kube-api-access-wpfrt\") pod \"swift-ring-rebalance-debug-46fwx\" (UID: 
\"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.164887 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.578983 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-46fwx"] Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.874650 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" event={"ID":"143ce599-e585-4f1d-bb4d-ca9f0e831217","Type":"ContainerStarted","Data":"630bf09edf501e2ecc449dbc1080b4372777c9967617e953df3b6d37f8d3c173"} Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.875007 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" event={"ID":"143ce599-e585-4f1d-bb4d-ca9f0e831217","Type":"ContainerStarted","Data":"143aab234cab1b6b8af96bbd847d6ed2b83c38267501869ca7290145197bb7ad"} Mar 09 16:20:48 crc kubenswrapper[4831]: I0309 16:20:48.906803 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" podStartSLOduration=1.906776979 podStartE2EDuration="1.906776979s" podCreationTimestamp="2026-03-09 16:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:20:48.901254051 +0000 UTC m=+1376.034936484" watchObservedRunningTime="2026-03-09 16:20:48.906776979 +0000 UTC m=+1376.040459412" Mar 09 16:20:51 crc kubenswrapper[4831]: I0309 16:20:51.906804 4831 generic.go:334] "Generic (PLEG): container finished" podID="143ce599-e585-4f1d-bb4d-ca9f0e831217" containerID="630bf09edf501e2ecc449dbc1080b4372777c9967617e953df3b6d37f8d3c173" exitCode=0 Mar 09 16:20:51 crc 
kubenswrapper[4831]: I0309 16:20:51.906870 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" event={"ID":"143ce599-e585-4f1d-bb4d-ca9f0e831217","Type":"ContainerDied","Data":"630bf09edf501e2ecc449dbc1080b4372777c9967617e953df3b6d37f8d3c173"} Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.208488 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.248129 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-46fwx"] Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.255477 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-46fwx"] Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.302081 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-swiftconf\") pod \"143ce599-e585-4f1d-bb4d-ca9f0e831217\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.302597 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-ring-data-devices\") pod \"143ce599-e585-4f1d-bb4d-ca9f0e831217\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.302638 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-scripts\") pod \"143ce599-e585-4f1d-bb4d-ca9f0e831217\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.302714 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/143ce599-e585-4f1d-bb4d-ca9f0e831217-etc-swift\") pod \"143ce599-e585-4f1d-bb4d-ca9f0e831217\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.302742 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpfrt\" (UniqueName: \"kubernetes.io/projected/143ce599-e585-4f1d-bb4d-ca9f0e831217-kube-api-access-wpfrt\") pod \"143ce599-e585-4f1d-bb4d-ca9f0e831217\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.302766 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-dispersionconf\") pod \"143ce599-e585-4f1d-bb4d-ca9f0e831217\" (UID: \"143ce599-e585-4f1d-bb4d-ca9f0e831217\") " Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.303198 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "143ce599-e585-4f1d-bb4d-ca9f0e831217" (UID: "143ce599-e585-4f1d-bb4d-ca9f0e831217"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.303606 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143ce599-e585-4f1d-bb4d-ca9f0e831217-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "143ce599-e585-4f1d-bb4d-ca9f0e831217" (UID: "143ce599-e585-4f1d-bb4d-ca9f0e831217"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.310728 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143ce599-e585-4f1d-bb4d-ca9f0e831217-kube-api-access-wpfrt" (OuterVolumeSpecName: "kube-api-access-wpfrt") pod "143ce599-e585-4f1d-bb4d-ca9f0e831217" (UID: "143ce599-e585-4f1d-bb4d-ca9f0e831217"). InnerVolumeSpecName "kube-api-access-wpfrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.323457 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-scripts" (OuterVolumeSpecName: "scripts") pod "143ce599-e585-4f1d-bb4d-ca9f0e831217" (UID: "143ce599-e585-4f1d-bb4d-ca9f0e831217"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.331158 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "143ce599-e585-4f1d-bb4d-ca9f0e831217" (UID: "143ce599-e585-4f1d-bb4d-ca9f0e831217"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.332706 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "143ce599-e585-4f1d-bb4d-ca9f0e831217" (UID: "143ce599-e585-4f1d-bb4d-ca9f0e831217"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.388322 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg"] Mar 09 16:20:53 crc kubenswrapper[4831]: E0309 16:20:53.388806 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143ce599-e585-4f1d-bb4d-ca9f0e831217" containerName="swift-ring-rebalance" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.388830 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="143ce599-e585-4f1d-bb4d-ca9f0e831217" containerName="swift-ring-rebalance" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.389001 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="143ce599-e585-4f1d-bb4d-ca9f0e831217" containerName="swift-ring-rebalance" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.389571 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.404777 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.404818 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/143ce599-e585-4f1d-bb4d-ca9f0e831217-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.404828 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/143ce599-e585-4f1d-bb4d-ca9f0e831217-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.404839 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpfrt\" (UniqueName: 
\"kubernetes.io/projected/143ce599-e585-4f1d-bb4d-ca9f0e831217-kube-api-access-wpfrt\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.404853 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.404867 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/143ce599-e585-4f1d-bb4d-ca9f0e831217-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.416018 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg"] Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.506808 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-ring-data-devices\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.506953 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-dispersionconf\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.507087 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxtvp\" (UniqueName: \"kubernetes.io/projected/94ee7c05-a961-497d-935c-896fb05825f4-kube-api-access-wxtvp\") pod \"swift-ring-rebalance-debug-mt2gg\" 
(UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.507155 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-swiftconf\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.507287 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-scripts\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.507392 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94ee7c05-a961-497d-935c-896fb05825f4-etc-swift\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.614913 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-dispersionconf\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.615049 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxtvp\" (UniqueName: 
\"kubernetes.io/projected/94ee7c05-a961-497d-935c-896fb05825f4-kube-api-access-wxtvp\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.615108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-swiftconf\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.615140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-scripts\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.615187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94ee7c05-a961-497d-935c-896fb05825f4-etc-swift\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.615240 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-ring-data-devices\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.616103 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/94ee7c05-a961-497d-935c-896fb05825f4-etc-swift\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.616951 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-scripts\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.617094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-ring-data-devices\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.619296 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-swiftconf\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.627432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-dispersionconf\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.634013 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxtvp\" (UniqueName: 
\"kubernetes.io/projected/94ee7c05-a961-497d-935c-896fb05825f4-kube-api-access-wxtvp\") pod \"swift-ring-rebalance-debug-mt2gg\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.636641 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143ce599-e585-4f1d-bb4d-ca9f0e831217" path="/var/lib/kubelet/pods/143ce599-e585-4f1d-bb4d-ca9f0e831217/volumes" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.765437 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.923935 4831 scope.go:117] "RemoveContainer" containerID="630bf09edf501e2ecc449dbc1080b4372777c9967617e953df3b6d37f8d3c173" Mar 09 16:20:53 crc kubenswrapper[4831]: I0309 16:20:53.923941 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-46fwx" Mar 09 16:20:54 crc kubenswrapper[4831]: I0309 16:20:54.218093 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg"] Mar 09 16:20:54 crc kubenswrapper[4831]: W0309 16:20:54.219106 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94ee7c05_a961_497d_935c_896fb05825f4.slice/crio-82fb2448729e77245054434e002c3beebb732b32d054a563622d440e833b190a WatchSource:0}: Error finding container 82fb2448729e77245054434e002c3beebb732b32d054a563622d440e833b190a: Status 404 returned error can't find the container with id 82fb2448729e77245054434e002c3beebb732b32d054a563622d440e833b190a Mar 09 16:20:54 crc kubenswrapper[4831]: I0309 16:20:54.933479 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" 
event={"ID":"94ee7c05-a961-497d-935c-896fb05825f4","Type":"ContainerStarted","Data":"a306a6146b43f162f38acf342a3e1ccdef2360057c7c4f52b6205e66bae9e7db"} Mar 09 16:20:54 crc kubenswrapper[4831]: I0309 16:20:54.933550 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" event={"ID":"94ee7c05-a961-497d-935c-896fb05825f4","Type":"ContainerStarted","Data":"82fb2448729e77245054434e002c3beebb732b32d054a563622d440e833b190a"} Mar 09 16:20:54 crc kubenswrapper[4831]: I0309 16:20:54.955284 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" podStartSLOduration=1.955267072 podStartE2EDuration="1.955267072s" podCreationTimestamp="2026-03-09 16:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:20:54.951089063 +0000 UTC m=+1382.084771486" watchObservedRunningTime="2026-03-09 16:20:54.955267072 +0000 UTC m=+1382.088949495" Mar 09 16:20:55 crc kubenswrapper[4831]: I0309 16:20:55.948539 4831 generic.go:334] "Generic (PLEG): container finished" podID="94ee7c05-a961-497d-935c-896fb05825f4" containerID="a306a6146b43f162f38acf342a3e1ccdef2360057c7c4f52b6205e66bae9e7db" exitCode=0 Mar 09 16:20:55 crc kubenswrapper[4831]: I0309 16:20:55.948603 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" event={"ID":"94ee7c05-a961-497d-935c-896fb05825f4","Type":"ContainerDied","Data":"a306a6146b43f162f38acf342a3e1ccdef2360057c7c4f52b6205e66bae9e7db"} Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.291362 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.337107 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg"] Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.341696 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg"] Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.371592 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-swiftconf\") pod \"94ee7c05-a961-497d-935c-896fb05825f4\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.371660 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-scripts\") pod \"94ee7c05-a961-497d-935c-896fb05825f4\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.371688 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94ee7c05-a961-497d-935c-896fb05825f4-etc-swift\") pod \"94ee7c05-a961-497d-935c-896fb05825f4\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.371721 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-ring-data-devices\") pod \"94ee7c05-a961-497d-935c-896fb05825f4\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.371814 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-dispersionconf\") pod \"94ee7c05-a961-497d-935c-896fb05825f4\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.371858 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxtvp\" (UniqueName: \"kubernetes.io/projected/94ee7c05-a961-497d-935c-896fb05825f4-kube-api-access-wxtvp\") pod \"94ee7c05-a961-497d-935c-896fb05825f4\" (UID: \"94ee7c05-a961-497d-935c-896fb05825f4\") " Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.373200 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "94ee7c05-a961-497d-935c-896fb05825f4" (UID: "94ee7c05-a961-497d-935c-896fb05825f4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.373700 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94ee7c05-a961-497d-935c-896fb05825f4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "94ee7c05-a961-497d-935c-896fb05825f4" (UID: "94ee7c05-a961-497d-935c-896fb05825f4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.379181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94ee7c05-a961-497d-935c-896fb05825f4-kube-api-access-wxtvp" (OuterVolumeSpecName: "kube-api-access-wxtvp") pod "94ee7c05-a961-497d-935c-896fb05825f4" (UID: "94ee7c05-a961-497d-935c-896fb05825f4"). InnerVolumeSpecName "kube-api-access-wxtvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.390124 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-scripts" (OuterVolumeSpecName: "scripts") pod "94ee7c05-a961-497d-935c-896fb05825f4" (UID: "94ee7c05-a961-497d-935c-896fb05825f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.392110 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "94ee7c05-a961-497d-935c-896fb05825f4" (UID: "94ee7c05-a961-497d-935c-896fb05825f4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.394914 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "94ee7c05-a961-497d-935c-896fb05825f4" (UID: "94ee7c05-a961-497d-935c-896fb05825f4"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.476303 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.476363 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxtvp\" (UniqueName: \"kubernetes.io/projected/94ee7c05-a961-497d-935c-896fb05825f4-kube-api-access-wxtvp\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.476393 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94ee7c05-a961-497d-935c-896fb05825f4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.476449 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.476472 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94ee7c05-a961-497d-935c-896fb05825f4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.476494 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94ee7c05-a961-497d-935c-896fb05825f4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.633848 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ee7c05-a961-497d-935c-896fb05825f4" path="/var/lib/kubelet/pods/94ee7c05-a961-497d-935c-896fb05825f4/volumes" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.964133 4831 scope.go:117] "RemoveContainer" 
containerID="a306a6146b43f162f38acf342a3e1ccdef2360057c7c4f52b6205e66bae9e7db" Mar 09 16:20:57 crc kubenswrapper[4831]: I0309 16:20:57.964169 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt2gg" Mar 09 16:21:01 crc kubenswrapper[4831]: I0309 16:21:01.259296 4831 scope.go:117] "RemoveContainer" containerID="626893ce754906cf904b0a44f23eda0f3441541ad49b103aad6af0afe422a259" Mar 09 16:21:03 crc kubenswrapper[4831]: I0309 16:21:03.018875 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:21:03 crc kubenswrapper[4831]: I0309 16:21:03.019241 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:21:33 crc kubenswrapper[4831]: I0309 16:21:33.018975 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:21:33 crc kubenswrapper[4831]: I0309 16:21:33.019759 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:21:34 crc kubenswrapper[4831]: 
E0309 16:21:34.104612 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:47628->38.102.83.162:46465: write tcp 38.102.83.162:47628->38.102.83.162:46465: write: broken pipe Mar 09 16:21:57 crc kubenswrapper[4831]: E0309 16:21:57.672790 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:42730->38.102.83.162:46465: write tcp 38.102.83.162:42730->38.102.83.162:46465: write: broken pipe Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.158957 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551222-l99zr"] Mar 09 16:22:00 crc kubenswrapper[4831]: E0309 16:22:00.159831 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ee7c05-a961-497d-935c-896fb05825f4" containerName="swift-ring-rebalance" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.159846 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ee7c05-a961-497d-935c-896fb05825f4" containerName="swift-ring-rebalance" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.159987 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ee7c05-a961-497d-935c-896fb05825f4" containerName="swift-ring-rebalance" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.160592 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551222-l99zr" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.163912 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.165317 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.165725 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.176177 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551222-l99zr"] Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.236685 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsv9w\" (UniqueName: \"kubernetes.io/projected/22226a64-f1b6-4342-858d-a388703c98f9-kube-api-access-xsv9w\") pod \"auto-csr-approver-29551222-l99zr\" (UID: \"22226a64-f1b6-4342-858d-a388703c98f9\") " pod="openshift-infra/auto-csr-approver-29551222-l99zr" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.337734 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsv9w\" (UniqueName: \"kubernetes.io/projected/22226a64-f1b6-4342-858d-a388703c98f9-kube-api-access-xsv9w\") pod \"auto-csr-approver-29551222-l99zr\" (UID: \"22226a64-f1b6-4342-858d-a388703c98f9\") " pod="openshift-infra/auto-csr-approver-29551222-l99zr" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.355570 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsv9w\" (UniqueName: \"kubernetes.io/projected/22226a64-f1b6-4342-858d-a388703c98f9-kube-api-access-xsv9w\") pod \"auto-csr-approver-29551222-l99zr\" (UID: \"22226a64-f1b6-4342-858d-a388703c98f9\") " 
pod="openshift-infra/auto-csr-approver-29551222-l99zr" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.480552 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551222-l99zr" Mar 09 16:22:00 crc kubenswrapper[4831]: I0309 16:22:00.757668 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551222-l99zr"] Mar 09 16:22:01 crc kubenswrapper[4831]: I0309 16:22:01.551131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551222-l99zr" event={"ID":"22226a64-f1b6-4342-858d-a388703c98f9","Type":"ContainerStarted","Data":"8945310d95ab56df89810ffe824ead57c573b3bf2bed9ea5940331a5a6474709"} Mar 09 16:22:02 crc kubenswrapper[4831]: I0309 16:22:02.569062 4831 generic.go:334] "Generic (PLEG): container finished" podID="22226a64-f1b6-4342-858d-a388703c98f9" containerID="f4c10b6be4990aa3b7030bc586d340a12ce3b55f2661d524b21a841f5f2a716a" exitCode=0 Mar 09 16:22:02 crc kubenswrapper[4831]: I0309 16:22:02.569309 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551222-l99zr" event={"ID":"22226a64-f1b6-4342-858d-a388703c98f9","Type":"ContainerDied","Data":"f4c10b6be4990aa3b7030bc586d340a12ce3b55f2661d524b21a841f5f2a716a"} Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.019395 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.019470 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.019514 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.020162 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b1a974da0742e73712bed402dc8072c1e0ed820d3d8d7cddb6f9574502461b3"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.020221 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://8b1a974da0742e73712bed402dc8072c1e0ed820d3d8d7cddb6f9574502461b3" gracePeriod=600 Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.577790 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="8b1a974da0742e73712bed402dc8072c1e0ed820d3d8d7cddb6f9574502461b3" exitCode=0 Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.578032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"8b1a974da0742e73712bed402dc8072c1e0ed820d3d8d7cddb6f9574502461b3"} Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.578107 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1"} Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.578124 4831 scope.go:117] "RemoveContainer" containerID="076daca06d23b29c2390e1c6817586c0ffa3caca70c6f9a78734cb8feec3892c" Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.871821 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551222-l99zr" Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.989617 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsv9w\" (UniqueName: \"kubernetes.io/projected/22226a64-f1b6-4342-858d-a388703c98f9-kube-api-access-xsv9w\") pod \"22226a64-f1b6-4342-858d-a388703c98f9\" (UID: \"22226a64-f1b6-4342-858d-a388703c98f9\") " Mar 09 16:22:03 crc kubenswrapper[4831]: I0309 16:22:03.997369 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22226a64-f1b6-4342-858d-a388703c98f9-kube-api-access-xsv9w" (OuterVolumeSpecName: "kube-api-access-xsv9w") pod "22226a64-f1b6-4342-858d-a388703c98f9" (UID: "22226a64-f1b6-4342-858d-a388703c98f9"). InnerVolumeSpecName "kube-api-access-xsv9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:22:04 crc kubenswrapper[4831]: I0309 16:22:04.091210 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsv9w\" (UniqueName: \"kubernetes.io/projected/22226a64-f1b6-4342-858d-a388703c98f9-kube-api-access-xsv9w\") on node \"crc\" DevicePath \"\"" Mar 09 16:22:04 crc kubenswrapper[4831]: I0309 16:22:04.586827 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551222-l99zr" event={"ID":"22226a64-f1b6-4342-858d-a388703c98f9","Type":"ContainerDied","Data":"8945310d95ab56df89810ffe824ead57c573b3bf2bed9ea5940331a5a6474709"} Mar 09 16:22:04 crc kubenswrapper[4831]: I0309 16:22:04.586875 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8945310d95ab56df89810ffe824ead57c573b3bf2bed9ea5940331a5a6474709" Mar 09 16:22:04 crc kubenswrapper[4831]: I0309 16:22:04.586848 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551222-l99zr" Mar 09 16:22:04 crc kubenswrapper[4831]: I0309 16:22:04.937064 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551216-p5vkf"] Mar 09 16:22:04 crc kubenswrapper[4831]: I0309 16:22:04.943079 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551216-p5vkf"] Mar 09 16:22:05 crc kubenswrapper[4831]: I0309 16:22:05.630122 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d0bed7-6225-44f3-b82b-afcbf1ea42dc" path="/var/lib/kubelet/pods/16d0bed7-6225-44f3-b82b-afcbf1ea42dc/volumes" Mar 09 16:22:11 crc kubenswrapper[4831]: E0309 16:22:11.074538 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:44220->38.102.83.162:46465: write tcp 38.102.83.162:44220->38.102.83.162:46465: write: broken pipe Mar 09 16:23:01 crc kubenswrapper[4831]: I0309 16:23:01.405380 
4831 scope.go:117] "RemoveContainer" containerID="21d0fb805a97bed1eaa12abf2728c7e45f29ecaaa5bbe03c3c2635f2690d267e" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.147161 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551224-zpffv"] Mar 09 16:24:00 crc kubenswrapper[4831]: E0309 16:24:00.148150 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22226a64-f1b6-4342-858d-a388703c98f9" containerName="oc" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.148173 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="22226a64-f1b6-4342-858d-a388703c98f9" containerName="oc" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.148387 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="22226a64-f1b6-4342-858d-a388703c98f9" containerName="oc" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.149222 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551224-zpffv" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.150790 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.152373 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.152577 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.161128 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551224-zpffv"] Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.222854 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw86d\" (UniqueName: 
\"kubernetes.io/projected/388afb16-3fb7-45fd-9789-0bdf0f104520-kube-api-access-pw86d\") pod \"auto-csr-approver-29551224-zpffv\" (UID: \"388afb16-3fb7-45fd-9789-0bdf0f104520\") " pod="openshift-infra/auto-csr-approver-29551224-zpffv" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.324518 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw86d\" (UniqueName: \"kubernetes.io/projected/388afb16-3fb7-45fd-9789-0bdf0f104520-kube-api-access-pw86d\") pod \"auto-csr-approver-29551224-zpffv\" (UID: \"388afb16-3fb7-45fd-9789-0bdf0f104520\") " pod="openshift-infra/auto-csr-approver-29551224-zpffv" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.359799 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw86d\" (UniqueName: \"kubernetes.io/projected/388afb16-3fb7-45fd-9789-0bdf0f104520-kube-api-access-pw86d\") pod \"auto-csr-approver-29551224-zpffv\" (UID: \"388afb16-3fb7-45fd-9789-0bdf0f104520\") " pod="openshift-infra/auto-csr-approver-29551224-zpffv" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.470192 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551224-zpffv" Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.755343 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551224-zpffv"] Mar 09 16:24:00 crc kubenswrapper[4831]: W0309 16:24:00.758780 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod388afb16_3fb7_45fd_9789_0bdf0f104520.slice/crio-aec418acb6066a26a3c04093e79c93fb8bb14d1515832df96f22ae37bc2a1d30 WatchSource:0}: Error finding container aec418acb6066a26a3c04093e79c93fb8bb14d1515832df96f22ae37bc2a1d30: Status 404 returned error can't find the container with id aec418acb6066a26a3c04093e79c93fb8bb14d1515832df96f22ae37bc2a1d30 Mar 09 16:24:00 crc kubenswrapper[4831]: I0309 16:24:00.760833 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:24:01 crc kubenswrapper[4831]: I0309 16:24:01.565070 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551224-zpffv" event={"ID":"388afb16-3fb7-45fd-9789-0bdf0f104520","Type":"ContainerStarted","Data":"aec418acb6066a26a3c04093e79c93fb8bb14d1515832df96f22ae37bc2a1d30"} Mar 09 16:24:02 crc kubenswrapper[4831]: I0309 16:24:02.592464 4831 generic.go:334] "Generic (PLEG): container finished" podID="388afb16-3fb7-45fd-9789-0bdf0f104520" containerID="a116f081289a73e47427586b81bc9a5236533b4dbd14cee2a96cd1ada64b0fbb" exitCode=0 Mar 09 16:24:02 crc kubenswrapper[4831]: I0309 16:24:02.592579 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551224-zpffv" event={"ID":"388afb16-3fb7-45fd-9789-0bdf0f104520","Type":"ContainerDied","Data":"a116f081289a73e47427586b81bc9a5236533b4dbd14cee2a96cd1ada64b0fbb"} Mar 09 16:24:03 crc kubenswrapper[4831]: I0309 16:24:03.019714 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:24:03 crc kubenswrapper[4831]: I0309 16:24:03.019825 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:24:03 crc kubenswrapper[4831]: I0309 16:24:03.879202 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551224-zpffv" Mar 09 16:24:03 crc kubenswrapper[4831]: I0309 16:24:03.975465 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw86d\" (UniqueName: \"kubernetes.io/projected/388afb16-3fb7-45fd-9789-0bdf0f104520-kube-api-access-pw86d\") pod \"388afb16-3fb7-45fd-9789-0bdf0f104520\" (UID: \"388afb16-3fb7-45fd-9789-0bdf0f104520\") " Mar 09 16:24:03 crc kubenswrapper[4831]: I0309 16:24:03.984213 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388afb16-3fb7-45fd-9789-0bdf0f104520-kube-api-access-pw86d" (OuterVolumeSpecName: "kube-api-access-pw86d") pod "388afb16-3fb7-45fd-9789-0bdf0f104520" (UID: "388afb16-3fb7-45fd-9789-0bdf0f104520"). InnerVolumeSpecName "kube-api-access-pw86d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:24:04 crc kubenswrapper[4831]: I0309 16:24:04.077954 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw86d\" (UniqueName: \"kubernetes.io/projected/388afb16-3fb7-45fd-9789-0bdf0f104520-kube-api-access-pw86d\") on node \"crc\" DevicePath \"\"" Mar 09 16:24:04 crc kubenswrapper[4831]: I0309 16:24:04.608738 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551224-zpffv" Mar 09 16:24:04 crc kubenswrapper[4831]: I0309 16:24:04.608817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551224-zpffv" event={"ID":"388afb16-3fb7-45fd-9789-0bdf0f104520","Type":"ContainerDied","Data":"aec418acb6066a26a3c04093e79c93fb8bb14d1515832df96f22ae37bc2a1d30"} Mar 09 16:24:04 crc kubenswrapper[4831]: I0309 16:24:04.608852 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aec418acb6066a26a3c04093e79c93fb8bb14d1515832df96f22ae37bc2a1d30" Mar 09 16:24:04 crc kubenswrapper[4831]: I0309 16:24:04.935182 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551218-wg5rb"] Mar 09 16:24:04 crc kubenswrapper[4831]: I0309 16:24:04.942089 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551218-wg5rb"] Mar 09 16:24:05 crc kubenswrapper[4831]: I0309 16:24:05.627806 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae125263-a50a-4c83-b1aa-c6712ccb28ce" path="/var/lib/kubelet/pods/ae125263-a50a-4c83-b1aa-c6712ccb28ce/volumes" Mar 09 16:24:20 crc kubenswrapper[4831]: E0309 16:24:20.818541 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:45124->38.102.83.162:46465: write tcp 38.102.83.162:45124->38.102.83.162:46465: write: broken pipe Mar 09 16:24:33 crc kubenswrapper[4831]: I0309 16:24:33.018746 
4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:24:33 crc kubenswrapper[4831]: I0309 16:24:33.020534 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:25:01 crc kubenswrapper[4831]: I0309 16:25:01.484855 4831 scope.go:117] "RemoveContainer" containerID="5c27ca727ab5980028d7b3c92e8b386f2fd0695fcdc49f10222ed0aec5d8f251" Mar 09 16:25:01 crc kubenswrapper[4831]: I0309 16:25:01.528542 4831 scope.go:117] "RemoveContainer" containerID="96b65a332167a8e9d0dffb822bfd00aebbc522a524f06acd70608fadf11fab4b" Mar 09 16:25:03 crc kubenswrapper[4831]: I0309 16:25:03.018922 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:25:03 crc kubenswrapper[4831]: I0309 16:25:03.019004 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:25:03 crc kubenswrapper[4831]: I0309 16:25:03.019065 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
Mar 09 16:25:03 crc kubenswrapper[4831]: I0309 16:25:03.019788 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:25:03 crc kubenswrapper[4831]: I0309 16:25:03.019884 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" gracePeriod=600 Mar 09 16:25:03 crc kubenswrapper[4831]: E0309 16:25:03.147283 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:25:04 crc kubenswrapper[4831]: I0309 16:25:04.140310 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" exitCode=0 Mar 09 16:25:04 crc kubenswrapper[4831]: I0309 16:25:04.140376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1"} Mar 09 16:25:04 crc kubenswrapper[4831]: I0309 16:25:04.140449 4831 scope.go:117] 
"RemoveContainer" containerID="8b1a974da0742e73712bed402dc8072c1e0ed820d3d8d7cddb6f9574502461b3" Mar 09 16:25:04 crc kubenswrapper[4831]: I0309 16:25:04.141117 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:25:04 crc kubenswrapper[4831]: E0309 16:25:04.141770 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:25:17 crc kubenswrapper[4831]: I0309 16:25:17.618196 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:25:17 crc kubenswrapper[4831]: E0309 16:25:17.619434 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:25:21 crc kubenswrapper[4831]: I0309 16:25:21.064271 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/root-account-create-update-htfm7"] Mar 09 16:25:21 crc kubenswrapper[4831]: I0309 16:25:21.073752 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/root-account-create-update-htfm7"] Mar 09 16:25:21 crc kubenswrapper[4831]: I0309 16:25:21.626332 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f34546-4a53-44eb-bac8-59527096a882" 
path="/var/lib/kubelet/pods/65f34546-4a53-44eb-bac8-59527096a882/volumes" Mar 09 16:25:28 crc kubenswrapper[4831]: I0309 16:25:28.618218 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:25:28 crc kubenswrapper[4831]: E0309 16:25:28.619543 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:25:41 crc kubenswrapper[4831]: I0309 16:25:41.617258 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:25:41 crc kubenswrapper[4831]: E0309 16:25:41.618855 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:25:54 crc kubenswrapper[4831]: I0309 16:25:54.617332 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:25:54 crc kubenswrapper[4831]: E0309 16:25:54.618621 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.149511 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551226-r5g4z"] Mar 09 16:26:00 crc kubenswrapper[4831]: E0309 16:26:00.150521 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388afb16-3fb7-45fd-9789-0bdf0f104520" containerName="oc" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.150537 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="388afb16-3fb7-45fd-9789-0bdf0f104520" containerName="oc" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.150681 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="388afb16-3fb7-45fd-9789-0bdf0f104520" containerName="oc" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.151318 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.157205 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.157464 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551226-r5g4z"] Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.157519 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.157541 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.206433 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhf2c\" (UniqueName: 
\"kubernetes.io/projected/718df0bb-6cad-4ce2-821d-f67fc014f745-kube-api-access-jhf2c\") pod \"auto-csr-approver-29551226-r5g4z\" (UID: \"718df0bb-6cad-4ce2-821d-f67fc014f745\") " pod="openshift-infra/auto-csr-approver-29551226-r5g4z" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.308477 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhf2c\" (UniqueName: \"kubernetes.io/projected/718df0bb-6cad-4ce2-821d-f67fc014f745-kube-api-access-jhf2c\") pod \"auto-csr-approver-29551226-r5g4z\" (UID: \"718df0bb-6cad-4ce2-821d-f67fc014f745\") " pod="openshift-infra/auto-csr-approver-29551226-r5g4z" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.335024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhf2c\" (UniqueName: \"kubernetes.io/projected/718df0bb-6cad-4ce2-821d-f67fc014f745-kube-api-access-jhf2c\") pod \"auto-csr-approver-29551226-r5g4z\" (UID: \"718df0bb-6cad-4ce2-821d-f67fc014f745\") " pod="openshift-infra/auto-csr-approver-29551226-r5g4z" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.489381 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" Mar 09 16:26:00 crc kubenswrapper[4831]: I0309 16:26:00.971486 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551226-r5g4z"] Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.589820 4831 scope.go:117] "RemoveContainer" containerID="cc14237174339607fded7afea1fec968a6b9994788cc6a30c285bcd751348893" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.624290 4831 scope.go:117] "RemoveContainer" containerID="57789b2b2e62070f312c166d1309ee9de26a37e6d216fcb2ebd44ebef8a3078e" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.633243 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" event={"ID":"718df0bb-6cad-4ce2-821d-f67fc014f745","Type":"ContainerStarted","Data":"8af18d8772748b44e2adcce522126adc86ae847696b356b56226adc06a3c6296"} Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.673968 4831 scope.go:117] "RemoveContainer" containerID="2716eca2b11afd50ad5194261604fbcabda8f5bbdae2efb5c70b34cf93551b1a" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.690860 4831 scope.go:117] "RemoveContainer" containerID="be18526ff3466cf3b46fa829866b1844cd86d6ef88e6d413e8243de352574be3" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.707123 4831 scope.go:117] "RemoveContainer" containerID="201b278f47db51f04e856b419acfcebebc6a7e5aef83909afce3d9b22984e58b" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.722808 4831 scope.go:117] "RemoveContainer" containerID="58544f91bf4b5a496bac735b6044286f0a66dac13d4a8765f8bae4dbed0262c1" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.740576 4831 scope.go:117] "RemoveContainer" containerID="17acb04554823975693117326773e6242f3cd81c478eba691d6f79794557b776" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.770393 4831 scope.go:117] "RemoveContainer" 
containerID="ad67a0f1b6ab71ed08d7709c8a6e23463ee08b49cff997baf927da95e79da787" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.786648 4831 scope.go:117] "RemoveContainer" containerID="64f942174429c52e8fe3b9d9614ce3d1f7eaf53128b3932dbf260c41466ddcc5" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.802223 4831 scope.go:117] "RemoveContainer" containerID="7e0368cf8ceca62972dff6780845e8bd90ee13e6d9c67d3f7b5d4bd90df92ab0" Mar 09 16:26:01 crc kubenswrapper[4831]: I0309 16:26:01.819649 4831 scope.go:117] "RemoveContainer" containerID="aa152cc22d247fbdc6f6fc8230b01b11951bc9789958a70e554b479c585250ba" Mar 09 16:26:03 crc kubenswrapper[4831]: I0309 16:26:03.655973 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" event={"ID":"718df0bb-6cad-4ce2-821d-f67fc014f745","Type":"ContainerStarted","Data":"31ed55511a0905a9add38b6d1dae8c24946886d3738605f0712a9cc3ebe88431"} Mar 09 16:26:03 crc kubenswrapper[4831]: I0309 16:26:03.674289 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" podStartSLOduration=1.493328639 podStartE2EDuration="3.674275658s" podCreationTimestamp="2026-03-09 16:26:00 +0000 UTC" firstStartedPulling="2026-03-09 16:26:00.975835324 +0000 UTC m=+1688.109517757" lastFinishedPulling="2026-03-09 16:26:03.156782353 +0000 UTC m=+1690.290464776" observedRunningTime="2026-03-09 16:26:03.672007404 +0000 UTC m=+1690.805689827" watchObservedRunningTime="2026-03-09 16:26:03.674275658 +0000 UTC m=+1690.807958081" Mar 09 16:26:04 crc kubenswrapper[4831]: I0309 16:26:04.664174 4831 generic.go:334] "Generic (PLEG): container finished" podID="718df0bb-6cad-4ce2-821d-f67fc014f745" containerID="31ed55511a0905a9add38b6d1dae8c24946886d3738605f0712a9cc3ebe88431" exitCode=0 Mar 09 16:26:04 crc kubenswrapper[4831]: I0309 16:26:04.664229 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551226-r5g4z" event={"ID":"718df0bb-6cad-4ce2-821d-f67fc014f745","Type":"ContainerDied","Data":"31ed55511a0905a9add38b6d1dae8c24946886d3738605f0712a9cc3ebe88431"} Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.013099 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.104379 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhf2c\" (UniqueName: \"kubernetes.io/projected/718df0bb-6cad-4ce2-821d-f67fc014f745-kube-api-access-jhf2c\") pod \"718df0bb-6cad-4ce2-821d-f67fc014f745\" (UID: \"718df0bb-6cad-4ce2-821d-f67fc014f745\") " Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.108753 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718df0bb-6cad-4ce2-821d-f67fc014f745-kube-api-access-jhf2c" (OuterVolumeSpecName: "kube-api-access-jhf2c") pod "718df0bb-6cad-4ce2-821d-f67fc014f745" (UID: "718df0bb-6cad-4ce2-821d-f67fc014f745"). InnerVolumeSpecName "kube-api-access-jhf2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.206877 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhf2c\" (UniqueName: \"kubernetes.io/projected/718df0bb-6cad-4ce2-821d-f67fc014f745-kube-api-access-jhf2c\") on node \"crc\" DevicePath \"\"" Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.681973 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" event={"ID":"718df0bb-6cad-4ce2-821d-f67fc014f745","Type":"ContainerDied","Data":"8af18d8772748b44e2adcce522126adc86ae847696b356b56226adc06a3c6296"} Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.682263 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8af18d8772748b44e2adcce522126adc86ae847696b356b56226adc06a3c6296" Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.682107 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551226-r5g4z" Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.762036 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551220-pmk86"] Mar 09 16:26:06 crc kubenswrapper[4831]: I0309 16:26:06.769722 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551220-pmk86"] Mar 09 16:26:07 crc kubenswrapper[4831]: I0309 16:26:07.629441 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd860360-f0c0-4848-ac43-de53b118a65c" path="/var/lib/kubelet/pods/fd860360-f0c0-4848-ac43-de53b118a65c/volumes" Mar 09 16:26:09 crc kubenswrapper[4831]: I0309 16:26:09.617202 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:26:09 crc kubenswrapper[4831]: E0309 16:26:09.617508 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:26:20 crc kubenswrapper[4831]: I0309 16:26:20.628037 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:26:20 crc kubenswrapper[4831]: E0309 16:26:20.629107 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:26:35 crc kubenswrapper[4831]: I0309 16:26:35.617873 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:26:35 crc kubenswrapper[4831]: E0309 16:26:35.618726 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:26:38 crc kubenswrapper[4831]: E0309 16:26:38.962377 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:39828->38.102.83.162:46465: write tcp 38.102.83.162:39828->38.102.83.162:46465: write: broken pipe Mar 09 16:26:47 crc 
kubenswrapper[4831]: I0309 16:26:47.617179 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:26:47 crc kubenswrapper[4831]: E0309 16:26:47.617957 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:26:52 crc kubenswrapper[4831]: I0309 16:26:52.053665 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5"] Mar 09 16:26:52 crc kubenswrapper[4831]: I0309 16:26:52.061783 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-db-create-ftmbs"] Mar 09 16:26:52 crc kubenswrapper[4831]: I0309 16:26:52.069245 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-efbe-account-create-update-mdbq5"] Mar 09 16:26:52 crc kubenswrapper[4831]: I0309 16:26:52.076680 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-create-ftmbs"] Mar 09 16:26:53 crc kubenswrapper[4831]: I0309 16:26:53.638954 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248c872a-0515-49c5-a99d-5af2c4295932" path="/var/lib/kubelet/pods/248c872a-0515-49c5-a99d-5af2c4295932/volumes" Mar 09 16:26:53 crc kubenswrapper[4831]: I0309 16:26:53.639976 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed68c1aa-2141-456d-8a99-e56de0d609e7" path="/var/lib/kubelet/pods/ed68c1aa-2141-456d-8a99-e56de0d609e7/volumes" Mar 09 16:26:57 crc kubenswrapper[4831]: E0309 16:26:57.504653 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom 
tcp 38.102.83.162:41308->38.102.83.162:46465: write tcp 38.102.83.162:41308->38.102.83.162:46465: write: broken pipe Mar 09 16:26:59 crc kubenswrapper[4831]: I0309 16:26:59.618201 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:26:59 crc kubenswrapper[4831]: E0309 16:26:59.618788 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:27:01 crc kubenswrapper[4831]: I0309 16:27:01.879910 4831 scope.go:117] "RemoveContainer" containerID="6048b24367d4cdff9e3164e9eefef623cb19bdb41c379a939cbcdbf6ee0d3540" Mar 09 16:27:01 crc kubenswrapper[4831]: I0309 16:27:01.901572 4831 scope.go:117] "RemoveContainer" containerID="f47df54652f825009ca5aa8dd5b88e3f9142358f0ccc1d1b277907bc44deca6e" Mar 09 16:27:01 crc kubenswrapper[4831]: I0309 16:27:01.977572 4831 scope.go:117] "RemoveContainer" containerID="9d1c6a6ef65bada3cf3d8c41d64c3ac7182a13def4fd8f9eb884c788d29088a8" Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.811939 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9gpx"] Mar 09 16:27:02 crc kubenswrapper[4831]: E0309 16:27:02.812926 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718df0bb-6cad-4ce2-821d-f67fc014f745" containerName="oc" Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.813080 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="718df0bb-6cad-4ce2-821d-f67fc014f745" containerName="oc" Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.813439 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="718df0bb-6cad-4ce2-821d-f67fc014f745" containerName="oc" Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.815137 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.820310 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9gpx"] Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.975248 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-utilities\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.975847 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6jj\" (UniqueName: \"kubernetes.io/projected/6c888569-fa79-4337-9a73-ee32c0d17632-kube-api-access-xg6jj\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:02 crc kubenswrapper[4831]: I0309 16:27:02.975889 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-catalog-content\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.079085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6jj\" (UniqueName: \"kubernetes.io/projected/6c888569-fa79-4337-9a73-ee32c0d17632-kube-api-access-xg6jj\") pod \"certified-operators-h9gpx\" (UID: 
\"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.079131 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-catalog-content\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.079219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-utilities\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.079835 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-catalog-content\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.079898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-utilities\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.104504 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6jj\" (UniqueName: \"kubernetes.io/projected/6c888569-fa79-4337-9a73-ee32c0d17632-kube-api-access-xg6jj\") pod \"certified-operators-h9gpx\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " 
pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.137851 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:03 crc kubenswrapper[4831]: I0309 16:27:03.590149 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9gpx"] Mar 09 16:27:03 crc kubenswrapper[4831]: W0309 16:27:03.598024 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c888569_fa79_4337_9a73_ee32c0d17632.slice/crio-34defaa62d81cef6c927a427e3a6635188d4037cda72753dd67673b38640cb08 WatchSource:0}: Error finding container 34defaa62d81cef6c927a427e3a6635188d4037cda72753dd67673b38640cb08: Status 404 returned error can't find the container with id 34defaa62d81cef6c927a427e3a6635188d4037cda72753dd67673b38640cb08 Mar 09 16:27:04 crc kubenswrapper[4831]: I0309 16:27:04.213657 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c888569-fa79-4337-9a73-ee32c0d17632" containerID="a7f99070d0853e935892f1e8a6d1714f863044762d88518d0ba744ad46551ffc" exitCode=0 Mar 09 16:27:04 crc kubenswrapper[4831]: I0309 16:27:04.213706 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gpx" event={"ID":"6c888569-fa79-4337-9a73-ee32c0d17632","Type":"ContainerDied","Data":"a7f99070d0853e935892f1e8a6d1714f863044762d88518d0ba744ad46551ffc"} Mar 09 16:27:04 crc kubenswrapper[4831]: I0309 16:27:04.214035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gpx" event={"ID":"6c888569-fa79-4337-9a73-ee32c0d17632","Type":"ContainerStarted","Data":"34defaa62d81cef6c927a427e3a6635188d4037cda72753dd67673b38640cb08"} Mar 09 16:27:06 crc kubenswrapper[4831]: I0309 16:27:06.045440 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["swift-kuttl-tests/keystone-db-sync-qf6r7"] Mar 09 16:27:06 crc kubenswrapper[4831]: I0309 16:27:06.057016 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-qf6r7"] Mar 09 16:27:06 crc kubenswrapper[4831]: I0309 16:27:06.231622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gpx" event={"ID":"6c888569-fa79-4337-9a73-ee32c0d17632","Type":"ContainerStarted","Data":"3036db81125717aacf6af00a1aee649abfde2d0a455499bf3be2594d55646229"} Mar 09 16:27:07 crc kubenswrapper[4831]: I0309 16:27:07.244110 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c888569-fa79-4337-9a73-ee32c0d17632" containerID="3036db81125717aacf6af00a1aee649abfde2d0a455499bf3be2594d55646229" exitCode=0 Mar 09 16:27:07 crc kubenswrapper[4831]: I0309 16:27:07.244193 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gpx" event={"ID":"6c888569-fa79-4337-9a73-ee32c0d17632","Type":"ContainerDied","Data":"3036db81125717aacf6af00a1aee649abfde2d0a455499bf3be2594d55646229"} Mar 09 16:27:07 crc kubenswrapper[4831]: I0309 16:27:07.635818 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d" path="/var/lib/kubelet/pods/8f55c227-3feb-4f45-a5fc-bb9adf1f5b4d/volumes" Mar 09 16:27:09 crc kubenswrapper[4831]: I0309 16:27:09.258950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gpx" event={"ID":"6c888569-fa79-4337-9a73-ee32c0d17632","Type":"ContainerStarted","Data":"1fd303c0625d461b9a5f4b69c4cde5c5bb21dc588481bf2112be50e0ce3cbbac"} Mar 09 16:27:09 crc kubenswrapper[4831]: I0309 16:27:09.282341 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9gpx" podStartSLOduration=3.374141129 podStartE2EDuration="7.282325939s" podCreationTimestamp="2026-03-09 16:27:02 +0000 
UTC" firstStartedPulling="2026-03-09 16:27:04.215360893 +0000 UTC m=+1751.349043326" lastFinishedPulling="2026-03-09 16:27:08.123545713 +0000 UTC m=+1755.257228136" observedRunningTime="2026-03-09 16:27:09.276617526 +0000 UTC m=+1756.410299949" watchObservedRunningTime="2026-03-09 16:27:09.282325939 +0000 UTC m=+1756.416008362" Mar 09 16:27:12 crc kubenswrapper[4831]: I0309 16:27:12.030671 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-fkx4m"] Mar 09 16:27:12 crc kubenswrapper[4831]: I0309 16:27:12.040995 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-fkx4m"] Mar 09 16:27:13 crc kubenswrapper[4831]: I0309 16:27:13.138302 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:13 crc kubenswrapper[4831]: I0309 16:27:13.139467 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:13 crc kubenswrapper[4831]: I0309 16:27:13.207247 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:13 crc kubenswrapper[4831]: I0309 16:27:13.335926 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:13 crc kubenswrapper[4831]: I0309 16:27:13.441096 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9gpx"] Mar 09 16:27:13 crc kubenswrapper[4831]: I0309 16:27:13.624652 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:27:13 crc kubenswrapper[4831]: E0309 16:27:13.624976 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:27:13 crc kubenswrapper[4831]: I0309 16:27:13.626364 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3673146f-01ef-44a4-b277-1332dd810a9d" path="/var/lib/kubelet/pods/3673146f-01ef-44a4-b277-1332dd810a9d/volumes" Mar 09 16:27:15 crc kubenswrapper[4831]: I0309 16:27:15.297266 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9gpx" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" containerName="registry-server" containerID="cri-o://1fd303c0625d461b9a5f4b69c4cde5c5bb21dc588481bf2112be50e0ce3cbbac" gracePeriod=2 Mar 09 16:27:16 crc kubenswrapper[4831]: I0309 16:27:16.311221 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c888569-fa79-4337-9a73-ee32c0d17632" containerID="1fd303c0625d461b9a5f4b69c4cde5c5bb21dc588481bf2112be50e0ce3cbbac" exitCode=0 Mar 09 16:27:16 crc kubenswrapper[4831]: I0309 16:27:16.311285 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gpx" event={"ID":"6c888569-fa79-4337-9a73-ee32c0d17632","Type":"ContainerDied","Data":"1fd303c0625d461b9a5f4b69c4cde5c5bb21dc588481bf2112be50e0ce3cbbac"} Mar 09 16:27:16 crc kubenswrapper[4831]: I0309 16:27:16.867974 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.001999 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-utilities\") pod \"6c888569-fa79-4337-9a73-ee32c0d17632\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.002212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6jj\" (UniqueName: \"kubernetes.io/projected/6c888569-fa79-4337-9a73-ee32c0d17632-kube-api-access-xg6jj\") pod \"6c888569-fa79-4337-9a73-ee32c0d17632\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.002239 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-catalog-content\") pod \"6c888569-fa79-4337-9a73-ee32c0d17632\" (UID: \"6c888569-fa79-4337-9a73-ee32c0d17632\") " Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.003105 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-utilities" (OuterVolumeSpecName: "utilities") pod "6c888569-fa79-4337-9a73-ee32c0d17632" (UID: "6c888569-fa79-4337-9a73-ee32c0d17632"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.011752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c888569-fa79-4337-9a73-ee32c0d17632-kube-api-access-xg6jj" (OuterVolumeSpecName: "kube-api-access-xg6jj") pod "6c888569-fa79-4337-9a73-ee32c0d17632" (UID: "6c888569-fa79-4337-9a73-ee32c0d17632"). InnerVolumeSpecName "kube-api-access-xg6jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.055349 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c888569-fa79-4337-9a73-ee32c0d17632" (UID: "6c888569-fa79-4337-9a73-ee32c0d17632"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.104078 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.104108 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6jj\" (UniqueName: \"kubernetes.io/projected/6c888569-fa79-4337-9a73-ee32c0d17632-kube-api-access-xg6jj\") on node \"crc\" DevicePath \"\"" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.104117 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c888569-fa79-4337-9a73-ee32c0d17632-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.324453 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gpx" event={"ID":"6c888569-fa79-4337-9a73-ee32c0d17632","Type":"ContainerDied","Data":"34defaa62d81cef6c927a427e3a6635188d4037cda72753dd67673b38640cb08"} Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.324502 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9gpx" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.324552 4831 scope.go:117] "RemoveContainer" containerID="1fd303c0625d461b9a5f4b69c4cde5c5bb21dc588481bf2112be50e0ce3cbbac" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.356778 4831 scope.go:117] "RemoveContainer" containerID="3036db81125717aacf6af00a1aee649abfde2d0a455499bf3be2594d55646229" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.377093 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9gpx"] Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.395195 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9gpx"] Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.403969 4831 scope.go:117] "RemoveContainer" containerID="a7f99070d0853e935892f1e8a6d1714f863044762d88518d0ba744ad46551ffc" Mar 09 16:27:17 crc kubenswrapper[4831]: I0309 16:27:17.628538 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" path="/var/lib/kubelet/pods/6c888569-fa79-4337-9a73-ee32c0d17632/volumes" Mar 09 16:27:25 crc kubenswrapper[4831]: I0309 16:27:25.617566 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:27:25 crc kubenswrapper[4831]: E0309 16:27:25.618296 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:27:36 crc kubenswrapper[4831]: I0309 16:27:36.618586 4831 scope.go:117] "RemoveContainer" 
containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:27:36 crc kubenswrapper[4831]: E0309 16:27:36.619928 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:27:49 crc kubenswrapper[4831]: I0309 16:27:49.617799 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:27:49 crc kubenswrapper[4831]: E0309 16:27:49.618493 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:27:54 crc kubenswrapper[4831]: I0309 16:27:54.051311 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf"] Mar 09 16:27:54 crc kubenswrapper[4831]: I0309 16:27:54.062040 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-db-create-krp95"] Mar 09 16:27:54 crc kubenswrapper[4831]: I0309 16:27:54.082192 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-db-create-krp95"] Mar 09 16:27:54 crc kubenswrapper[4831]: I0309 16:27:54.095573 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-71a7-account-create-update-mbwsf"] Mar 09 16:27:55 crc kubenswrapper[4831]: I0309 
16:27:55.629570 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b2ff70-bfd2-41cd-a0f7-3922fb99b379" path="/var/lib/kubelet/pods/85b2ff70-bfd2-41cd-a0f7-3922fb99b379/volumes" Mar 09 16:27:55 crc kubenswrapper[4831]: I0309 16:27:55.630701 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa19489-9820-4ed2-8e45-d829182a9b07" path="/var/lib/kubelet/pods/9aa19489-9820-4ed2-8e45-d829182a9b07/volumes" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.146552 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551228-gs4b5"] Mar 09 16:28:00 crc kubenswrapper[4831]: E0309 16:28:00.147068 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" containerName="extract-content" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.147084 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" containerName="extract-content" Mar 09 16:28:00 crc kubenswrapper[4831]: E0309 16:28:00.147118 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" containerName="extract-utilities" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.147126 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" containerName="extract-utilities" Mar 09 16:28:00 crc kubenswrapper[4831]: E0309 16:28:00.147136 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" containerName="registry-server" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.147144 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" containerName="registry-server" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.147271 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c888569-fa79-4337-9a73-ee32c0d17632" 
containerName="registry-server" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.147824 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551228-gs4b5" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.150279 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.150356 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.150685 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.166937 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551228-gs4b5"] Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.349181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7xv\" (UniqueName: \"kubernetes.io/projected/20ca7be6-7949-429b-8aae-074a0b28c000-kube-api-access-ml7xv\") pod \"auto-csr-approver-29551228-gs4b5\" (UID: \"20ca7be6-7949-429b-8aae-074a0b28c000\") " pod="openshift-infra/auto-csr-approver-29551228-gs4b5" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.450640 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7xv\" (UniqueName: \"kubernetes.io/projected/20ca7be6-7949-429b-8aae-074a0b28c000-kube-api-access-ml7xv\") pod \"auto-csr-approver-29551228-gs4b5\" (UID: \"20ca7be6-7949-429b-8aae-074a0b28c000\") " pod="openshift-infra/auto-csr-approver-29551228-gs4b5" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.469563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7xv\" (UniqueName: 
\"kubernetes.io/projected/20ca7be6-7949-429b-8aae-074a0b28c000-kube-api-access-ml7xv\") pod \"auto-csr-approver-29551228-gs4b5\" (UID: \"20ca7be6-7949-429b-8aae-074a0b28c000\") " pod="openshift-infra/auto-csr-approver-29551228-gs4b5" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.472237 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551228-gs4b5" Mar 09 16:28:00 crc kubenswrapper[4831]: I0309 16:28:00.918330 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551228-gs4b5"] Mar 09 16:28:00 crc kubenswrapper[4831]: W0309 16:28:00.937202 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20ca7be6_7949_429b_8aae_074a0b28c000.slice/crio-df51ecdd1a4ca97512361a45439352c32a227e7f2b067f1edab49d29c4f230f4 WatchSource:0}: Error finding container df51ecdd1a4ca97512361a45439352c32a227e7f2b067f1edab49d29c4f230f4: Status 404 returned error can't find the container with id df51ecdd1a4ca97512361a45439352c32a227e7f2b067f1edab49d29c4f230f4 Mar 09 16:28:01 crc kubenswrapper[4831]: I0309 16:28:01.667302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551228-gs4b5" event={"ID":"20ca7be6-7949-429b-8aae-074a0b28c000","Type":"ContainerStarted","Data":"df51ecdd1a4ca97512361a45439352c32a227e7f2b067f1edab49d29c4f230f4"} Mar 09 16:28:02 crc kubenswrapper[4831]: I0309 16:28:02.055704 4831 scope.go:117] "RemoveContainer" containerID="106fd2db5a102ff27824b3bd79bdbb7681b1fbcbcdf970115375e28974c3867c" Mar 09 16:28:02 crc kubenswrapper[4831]: I0309 16:28:02.082095 4831 scope.go:117] "RemoveContainer" containerID="144ab3d27c412cb08d01b60095af350326339bb75d2c84434196eb25b8dcaf9f" Mar 09 16:28:02 crc kubenswrapper[4831]: I0309 16:28:02.099962 4831 scope.go:117] "RemoveContainer" 
containerID="4199b7a2ccab904a1c5a4d2edc36cf27a1b80580507e80a102596a2d7ad0b319" Mar 09 16:28:02 crc kubenswrapper[4831]: I0309 16:28:02.126610 4831 scope.go:117] "RemoveContainer" containerID="4d1151560c5239f77ef053ccc623a43ac93aafab37c00ca946c459cb485439d9" Mar 09 16:28:03 crc kubenswrapper[4831]: I0309 16:28:03.685288 4831 generic.go:334] "Generic (PLEG): container finished" podID="20ca7be6-7949-429b-8aae-074a0b28c000" containerID="d79ad13a6052a98b198ba093f6a7ea885909d266af4642f977b2e31dc23db442" exitCode=0 Mar 09 16:28:03 crc kubenswrapper[4831]: I0309 16:28:03.685898 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551228-gs4b5" event={"ID":"20ca7be6-7949-429b-8aae-074a0b28c000","Type":"ContainerDied","Data":"d79ad13a6052a98b198ba093f6a7ea885909d266af4642f977b2e31dc23db442"} Mar 09 16:28:04 crc kubenswrapper[4831]: I0309 16:28:04.617223 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:28:04 crc kubenswrapper[4831]: E0309 16:28:04.617457 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:28:04 crc kubenswrapper[4831]: I0309 16:28:04.975318 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551228-gs4b5" Mar 09 16:28:05 crc kubenswrapper[4831]: I0309 16:28:05.033098 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml7xv\" (UniqueName: \"kubernetes.io/projected/20ca7be6-7949-429b-8aae-074a0b28c000-kube-api-access-ml7xv\") pod \"20ca7be6-7949-429b-8aae-074a0b28c000\" (UID: \"20ca7be6-7949-429b-8aae-074a0b28c000\") " Mar 09 16:28:05 crc kubenswrapper[4831]: I0309 16:28:05.037795 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ca7be6-7949-429b-8aae-074a0b28c000-kube-api-access-ml7xv" (OuterVolumeSpecName: "kube-api-access-ml7xv") pod "20ca7be6-7949-429b-8aae-074a0b28c000" (UID: "20ca7be6-7949-429b-8aae-074a0b28c000"). InnerVolumeSpecName "kube-api-access-ml7xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:28:05 crc kubenswrapper[4831]: I0309 16:28:05.134974 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml7xv\" (UniqueName: \"kubernetes.io/projected/20ca7be6-7949-429b-8aae-074a0b28c000-kube-api-access-ml7xv\") on node \"crc\" DevicePath \"\"" Mar 09 16:28:05 crc kubenswrapper[4831]: I0309 16:28:05.708237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551228-gs4b5" event={"ID":"20ca7be6-7949-429b-8aae-074a0b28c000","Type":"ContainerDied","Data":"df51ecdd1a4ca97512361a45439352c32a227e7f2b067f1edab49d29c4f230f4"} Mar 09 16:28:05 crc kubenswrapper[4831]: I0309 16:28:05.708624 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df51ecdd1a4ca97512361a45439352c32a227e7f2b067f1edab49d29c4f230f4" Mar 09 16:28:05 crc kubenswrapper[4831]: I0309 16:28:05.708627 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551228-gs4b5" Mar 09 16:28:06 crc kubenswrapper[4831]: I0309 16:28:06.024265 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551222-l99zr"] Mar 09 16:28:06 crc kubenswrapper[4831]: I0309 16:28:06.030027 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551222-l99zr"] Mar 09 16:28:07 crc kubenswrapper[4831]: I0309 16:28:07.633548 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22226a64-f1b6-4342-858d-a388703c98f9" path="/var/lib/kubelet/pods/22226a64-f1b6-4342-858d-a388703c98f9/volumes" Mar 09 16:28:16 crc kubenswrapper[4831]: I0309 16:28:16.617056 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:28:16 crc kubenswrapper[4831]: E0309 16:28:16.617801 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:28:27 crc kubenswrapper[4831]: I0309 16:28:27.618389 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:28:27 crc kubenswrapper[4831]: E0309 16:28:27.619443 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:28:42 crc kubenswrapper[4831]: I0309 16:28:42.617245 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:28:42 crc kubenswrapper[4831]: E0309 16:28:42.618180 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:28:55 crc kubenswrapper[4831]: I0309 16:28:55.617834 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:28:55 crc kubenswrapper[4831]: E0309 16:28:55.618621 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:29:02 crc kubenswrapper[4831]: I0309 16:29:02.229018 4831 scope.go:117] "RemoveContainer" containerID="f4c10b6be4990aa3b7030bc586d340a12ce3b55f2661d524b21a841f5f2a716a" Mar 09 16:29:07 crc kubenswrapper[4831]: E0309 16:29:07.835688 4831 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.162:49412->38.102.83.162:46465: read tcp 38.102.83.162:49412->38.102.83.162:46465: read: connection reset by peer Mar 09 16:29:08 crc kubenswrapper[4831]: I0309 16:29:08.617892 4831 scope.go:117] "RemoveContainer" 
containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:29:08 crc kubenswrapper[4831]: E0309 16:29:08.618751 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:29:20 crc kubenswrapper[4831]: I0309 16:29:20.618356 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:29:20 crc kubenswrapper[4831]: E0309 16:29:20.619573 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:29:23 crc kubenswrapper[4831]: E0309 16:29:23.844940 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:48520->38.102.83.162:46465: write tcp 38.102.83.162:48520->38.102.83.162:46465: write: broken pipe Mar 09 16:29:34 crc kubenswrapper[4831]: I0309 16:29:34.617840 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:29:34 crc kubenswrapper[4831]: E0309 16:29:34.618481 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:29:45 crc kubenswrapper[4831]: I0309 16:29:45.618270 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:29:45 crc kubenswrapper[4831]: E0309 16:29:45.618990 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:29:58 crc kubenswrapper[4831]: I0309 16:29:58.617418 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:29:58 crc kubenswrapper[4831]: E0309 16:29:58.618211 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.158772 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551230-sz9vm"] Mar 09 16:30:00 crc kubenswrapper[4831]: E0309 16:30:00.159610 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ca7be6-7949-429b-8aae-074a0b28c000" containerName="oc" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.159631 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="20ca7be6-7949-429b-8aae-074a0b28c000" containerName="oc" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.159857 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ca7be6-7949-429b-8aae-074a0b28c000" containerName="oc" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.160652 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551230-sz9vm" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.164541 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.165208 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.168790 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.171868 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8"] Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.172833 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.174896 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.174900 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.178628 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551230-sz9vm"] Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.196292 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8"] Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.206217 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfec198c-a320-4c26-b041-88b8d7bc9a17-config-volume\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.206265 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8z94\" (UniqueName: \"kubernetes.io/projected/bfec198c-a320-4c26-b041-88b8d7bc9a17-kube-api-access-s8z94\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.206294 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kp5n\" (UniqueName: 
\"kubernetes.io/projected/01c2aa8a-7df6-4198-95c3-bdb7c661a845-kube-api-access-4kp5n\") pod \"auto-csr-approver-29551230-sz9vm\" (UID: \"01c2aa8a-7df6-4198-95c3-bdb7c661a845\") " pod="openshift-infra/auto-csr-approver-29551230-sz9vm" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.206641 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfec198c-a320-4c26-b041-88b8d7bc9a17-secret-volume\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.307809 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfec198c-a320-4c26-b041-88b8d7bc9a17-secret-volume\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.307894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfec198c-a320-4c26-b041-88b8d7bc9a17-config-volume\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.307955 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8z94\" (UniqueName: \"kubernetes.io/projected/bfec198c-a320-4c26-b041-88b8d7bc9a17-kube-api-access-s8z94\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 
16:30:00.308005 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kp5n\" (UniqueName: \"kubernetes.io/projected/01c2aa8a-7df6-4198-95c3-bdb7c661a845-kube-api-access-4kp5n\") pod \"auto-csr-approver-29551230-sz9vm\" (UID: \"01c2aa8a-7df6-4198-95c3-bdb7c661a845\") " pod="openshift-infra/auto-csr-approver-29551230-sz9vm" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.309762 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfec198c-a320-4c26-b041-88b8d7bc9a17-config-volume\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.314357 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfec198c-a320-4c26-b041-88b8d7bc9a17-secret-volume\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.331284 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8z94\" (UniqueName: \"kubernetes.io/projected/bfec198c-a320-4c26-b041-88b8d7bc9a17-kube-api-access-s8z94\") pod \"collect-profiles-29551230-65lk8\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.331587 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kp5n\" (UniqueName: \"kubernetes.io/projected/01c2aa8a-7df6-4198-95c3-bdb7c661a845-kube-api-access-4kp5n\") pod \"auto-csr-approver-29551230-sz9vm\" (UID: \"01c2aa8a-7df6-4198-95c3-bdb7c661a845\") " 
pod="openshift-infra/auto-csr-approver-29551230-sz9vm" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.483499 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551230-sz9vm" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.500444 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:00 crc kubenswrapper[4831]: I0309 16:30:00.991578 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8"] Mar 09 16:30:01 crc kubenswrapper[4831]: I0309 16:30:01.043846 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551230-sz9vm"] Mar 09 16:30:01 crc kubenswrapper[4831]: W0309 16:30:01.046509 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01c2aa8a_7df6_4198_95c3_bdb7c661a845.slice/crio-dc6c1d74455743361d6d3eb07dfa3b3941d4cfdb5b5e108a128c6b4d5d62c9e3 WatchSource:0}: Error finding container dc6c1d74455743361d6d3eb07dfa3b3941d4cfdb5b5e108a128c6b4d5d62c9e3: Status 404 returned error can't find the container with id dc6c1d74455743361d6d3eb07dfa3b3941d4cfdb5b5e108a128c6b4d5d62c9e3 Mar 09 16:30:01 crc kubenswrapper[4831]: I0309 16:30:01.049869 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:30:01 crc kubenswrapper[4831]: I0309 16:30:01.846705 4831 generic.go:334] "Generic (PLEG): container finished" podID="bfec198c-a320-4c26-b041-88b8d7bc9a17" containerID="784403feec7222648dfb0233781aa986b6b6c7fd2366dad4579f39897a81129a" exitCode=0 Mar 09 16:30:01 crc kubenswrapper[4831]: I0309 16:30:01.846772 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" 
event={"ID":"bfec198c-a320-4c26-b041-88b8d7bc9a17","Type":"ContainerDied","Data":"784403feec7222648dfb0233781aa986b6b6c7fd2366dad4579f39897a81129a"} Mar 09 16:30:01 crc kubenswrapper[4831]: I0309 16:30:01.847189 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" event={"ID":"bfec198c-a320-4c26-b041-88b8d7bc9a17","Type":"ContainerStarted","Data":"92704f8d4198a9f2e1eba459ab71e91479f1e0987045d5b0a0aef8ef766e2d73"} Mar 09 16:30:01 crc kubenswrapper[4831]: I0309 16:30:01.848915 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551230-sz9vm" event={"ID":"01c2aa8a-7df6-4198-95c3-bdb7c661a845","Type":"ContainerStarted","Data":"dc6c1d74455743361d6d3eb07dfa3b3941d4cfdb5b5e108a128c6b4d5d62c9e3"} Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.160977 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.258756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfec198c-a320-4c26-b041-88b8d7bc9a17-secret-volume\") pod \"bfec198c-a320-4c26-b041-88b8d7bc9a17\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.258815 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8z94\" (UniqueName: \"kubernetes.io/projected/bfec198c-a320-4c26-b041-88b8d7bc9a17-kube-api-access-s8z94\") pod \"bfec198c-a320-4c26-b041-88b8d7bc9a17\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.258896 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bfec198c-a320-4c26-b041-88b8d7bc9a17-config-volume\") pod \"bfec198c-a320-4c26-b041-88b8d7bc9a17\" (UID: \"bfec198c-a320-4c26-b041-88b8d7bc9a17\") " Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.259997 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfec198c-a320-4c26-b041-88b8d7bc9a17-config-volume" (OuterVolumeSpecName: "config-volume") pod "bfec198c-a320-4c26-b041-88b8d7bc9a17" (UID: "bfec198c-a320-4c26-b041-88b8d7bc9a17"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.265569 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfec198c-a320-4c26-b041-88b8d7bc9a17-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bfec198c-a320-4c26-b041-88b8d7bc9a17" (UID: "bfec198c-a320-4c26-b041-88b8d7bc9a17"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.265673 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfec198c-a320-4c26-b041-88b8d7bc9a17-kube-api-access-s8z94" (OuterVolumeSpecName: "kube-api-access-s8z94") pod "bfec198c-a320-4c26-b041-88b8d7bc9a17" (UID: "bfec198c-a320-4c26-b041-88b8d7bc9a17"). InnerVolumeSpecName "kube-api-access-s8z94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.360917 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfec198c-a320-4c26-b041-88b8d7bc9a17-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.360951 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8z94\" (UniqueName: \"kubernetes.io/projected/bfec198c-a320-4c26-b041-88b8d7bc9a17-kube-api-access-s8z94\") on node \"crc\" DevicePath \"\"" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.360962 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfec198c-a320-4c26-b041-88b8d7bc9a17-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.868227 4831 generic.go:334] "Generic (PLEG): container finished" podID="01c2aa8a-7df6-4198-95c3-bdb7c661a845" containerID="2f9ccb3260343621d9d5081fd934e3e76b6371f6a2c90c82f1789790ad814d52" exitCode=0 Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.868311 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551230-sz9vm" event={"ID":"01c2aa8a-7df6-4198-95c3-bdb7c661a845","Type":"ContainerDied","Data":"2f9ccb3260343621d9d5081fd934e3e76b6371f6a2c90c82f1789790ad814d52"} Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.870063 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" event={"ID":"bfec198c-a320-4c26-b041-88b8d7bc9a17","Type":"ContainerDied","Data":"92704f8d4198a9f2e1eba459ab71e91479f1e0987045d5b0a0aef8ef766e2d73"} Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.870103 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="92704f8d4198a9f2e1eba459ab71e91479f1e0987045d5b0a0aef8ef766e2d73" Mar 09 16:30:03 crc kubenswrapper[4831]: I0309 16:30:03.870300 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551230-65lk8" Mar 09 16:30:05 crc kubenswrapper[4831]: I0309 16:30:05.174793 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551230-sz9vm" Mar 09 16:30:05 crc kubenswrapper[4831]: I0309 16:30:05.184619 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kp5n\" (UniqueName: \"kubernetes.io/projected/01c2aa8a-7df6-4198-95c3-bdb7c661a845-kube-api-access-4kp5n\") pod \"01c2aa8a-7df6-4198-95c3-bdb7c661a845\" (UID: \"01c2aa8a-7df6-4198-95c3-bdb7c661a845\") " Mar 09 16:30:05 crc kubenswrapper[4831]: I0309 16:30:05.191355 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c2aa8a-7df6-4198-95c3-bdb7c661a845-kube-api-access-4kp5n" (OuterVolumeSpecName: "kube-api-access-4kp5n") pod "01c2aa8a-7df6-4198-95c3-bdb7c661a845" (UID: "01c2aa8a-7df6-4198-95c3-bdb7c661a845"). InnerVolumeSpecName "kube-api-access-4kp5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:30:05 crc kubenswrapper[4831]: I0309 16:30:05.286886 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kp5n\" (UniqueName: \"kubernetes.io/projected/01c2aa8a-7df6-4198-95c3-bdb7c661a845-kube-api-access-4kp5n\") on node \"crc\" DevicePath \"\"" Mar 09 16:30:05 crc kubenswrapper[4831]: I0309 16:30:05.886777 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551230-sz9vm" event={"ID":"01c2aa8a-7df6-4198-95c3-bdb7c661a845","Type":"ContainerDied","Data":"dc6c1d74455743361d6d3eb07dfa3b3941d4cfdb5b5e108a128c6b4d5d62c9e3"} Mar 09 16:30:05 crc kubenswrapper[4831]: I0309 16:30:05.887067 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6c1d74455743361d6d3eb07dfa3b3941d4cfdb5b5e108a128c6b4d5d62c9e3" Mar 09 16:30:05 crc kubenswrapper[4831]: I0309 16:30:05.886877 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551230-sz9vm" Mar 09 16:30:06 crc kubenswrapper[4831]: I0309 16:30:06.251237 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551224-zpffv"] Mar 09 16:30:06 crc kubenswrapper[4831]: I0309 16:30:06.256848 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551224-zpffv"] Mar 09 16:30:07 crc kubenswrapper[4831]: I0309 16:30:07.627314 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388afb16-3fb7-45fd-9789-0bdf0f104520" path="/var/lib/kubelet/pods/388afb16-3fb7-45fd-9789-0bdf0f104520/volumes" Mar 09 16:30:11 crc kubenswrapper[4831]: I0309 16:30:11.618412 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:30:11 crc kubenswrapper[4831]: I0309 16:30:11.943904 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"258b09aa0807bb4f3c0676f22914bbc545a76966e9075840cca57fa0980ae55e"} Mar 09 16:30:13 crc kubenswrapper[4831]: E0309 16:30:13.049599 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:36584->38.102.83.162:46465: write tcp 38.102.83.162:36584->38.102.83.162:46465: write: broken pipe Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.630602 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m"] Mar 09 16:31:00 crc kubenswrapper[4831]: E0309 16:31:00.632043 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfec198c-a320-4c26-b041-88b8d7bc9a17" containerName="collect-profiles" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.632065 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfec198c-a320-4c26-b041-88b8d7bc9a17" containerName="collect-profiles" Mar 09 16:31:00 crc kubenswrapper[4831]: E0309 16:31:00.632085 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c2aa8a-7df6-4198-95c3-bdb7c661a845" containerName="oc" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.632095 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c2aa8a-7df6-4198-95c3-bdb7c661a845" containerName="oc" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.632296 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfec198c-a320-4c26-b041-88b8d7bc9a17" containerName="collect-profiles" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.632326 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c2aa8a-7df6-4198-95c3-bdb7c661a845" containerName="oc" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.633045 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.640581 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.640597 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.651965 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m"] Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.686720 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-ring-data-devices\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.686850 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-scripts\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.686907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-swiftconf\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.686952 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-etc-swift\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.687011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-dispersionconf\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.687070 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756cl\" (UniqueName: \"kubernetes.io/projected/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-kube-api-access-756cl\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.788752 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-swiftconf\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.789217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-etc-swift\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc 
kubenswrapper[4831]: I0309 16:31:00.789315 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-dispersionconf\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.789456 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756cl\" (UniqueName: \"kubernetes.io/projected/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-kube-api-access-756cl\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.789603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-ring-data-devices\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.789717 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-scripts\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.789901 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-etc-swift\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 
16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.790390 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-scripts\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.790443 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-ring-data-devices\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.794411 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-swiftconf\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.795301 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-dispersionconf\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc kubenswrapper[4831]: I0309 16:31:00.832940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756cl\" (UniqueName: \"kubernetes.io/projected/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-kube-api-access-756cl\") pod \"swift-ring-rebalance-debug-xbp9m\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:00 crc 
kubenswrapper[4831]: I0309 16:31:00.976023 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.262976 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m"] Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.324718 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" event={"ID":"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f","Type":"ContainerStarted","Data":"09e8faf9f0683d17319b8e765e113622595767f1fda3c1169c83591d3402fab6"} Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.721134 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.730956 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.743696 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.750077 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.753596 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.766171 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825178 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/594b7519-72cc-45b8-ab1a-2bcac1e8f514-cache\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825242 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0232625e-8378-498f-9689-2ba54adc50ed-etc-swift\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58gvw\" (UniqueName: \"kubernetes.io/projected/0232625e-8378-498f-9689-2ba54adc50ed-kube-api-access-58gvw\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825327 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpl2t\" (UniqueName: \"kubernetes.io/projected/594b7519-72cc-45b8-ab1a-2bcac1e8f514-kube-api-access-gpl2t\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825583 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/594b7519-72cc-45b8-ab1a-2bcac1e8f514-lock\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825612 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/594b7519-72cc-45b8-ab1a-2bcac1e8f514-etc-swift\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825664 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0232625e-8378-498f-9689-2ba54adc50ed-cache\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.825680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/0232625e-8378-498f-9689-2ba54adc50ed-lock\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926135 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpl2t\" (UniqueName: \"kubernetes.io/projected/594b7519-72cc-45b8-ab1a-2bcac1e8f514-kube-api-access-gpl2t\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926180 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/594b7519-72cc-45b8-ab1a-2bcac1e8f514-lock\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926208 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/594b7519-72cc-45b8-ab1a-2bcac1e8f514-etc-swift\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926234 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0232625e-8378-498f-9689-2ba54adc50ed-cache\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 
crc kubenswrapper[4831]: I0309 16:31:01.926281 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0232625e-8378-498f-9689-2ba54adc50ed-lock\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926295 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/594b7519-72cc-45b8-ab1a-2bcac1e8f514-cache\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926321 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0232625e-8378-498f-9689-2ba54adc50ed-etc-swift\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926338 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926352 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58gvw\" (UniqueName: \"kubernetes.io/projected/0232625e-8378-498f-9689-2ba54adc50ed-kube-api-access-58gvw\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926747 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/594b7519-72cc-45b8-ab1a-2bcac1e8f514-lock\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926807 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") device mount path \"/mnt/openstack/pv11\"" pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.926885 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") device mount path \"/mnt/openstack/pv07\"" pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.927066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/594b7519-72cc-45b8-ab1a-2bcac1e8f514-cache\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.927524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0232625e-8378-498f-9689-2ba54adc50ed-cache\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.927598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0232625e-8378-498f-9689-2ba54adc50ed-lock\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 
16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.943463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0232625e-8378-498f-9689-2ba54adc50ed-etc-swift\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.944357 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/594b7519-72cc-45b8-ab1a-2bcac1e8f514-etc-swift\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.945185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.945574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58gvw\" (UniqueName: \"kubernetes.io/projected/0232625e-8378-498f-9689-2ba54adc50ed-kube-api-access-58gvw\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.945983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpl2t\" (UniqueName: \"kubernetes.io/projected/594b7519-72cc-45b8-ab1a-2bcac1e8f514-kube-api-access-gpl2t\") pod \"swift-storage-1\" (UID: \"594b7519-72cc-45b8-ab1a-2bcac1e8f514\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:01 crc kubenswrapper[4831]: I0309 16:31:01.948827 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-2\" (UID: \"0232625e-8378-498f-9689-2ba54adc50ed\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.055615 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.064731 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.335977 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" event={"ID":"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f","Type":"ContainerStarted","Data":"b586d3d078dc2463c14548774b22f585d8b393a971888bb3146c57ee121957c3"} Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.341529 4831 scope.go:117] "RemoveContainer" containerID="a116f081289a73e47427586b81bc9a5236533b4dbd14cee2a96cd1ada64b0fbb" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.356713 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" podStartSLOduration=2.356689433 podStartE2EDuration="2.356689433s" podCreationTimestamp="2026-03-09 16:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:02.352509324 +0000 UTC m=+1989.486191767" watchObservedRunningTime="2026-03-09 16:31:02.356689433 +0000 UTC m=+1989.490371856" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.557431 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 16:31:02 crc kubenswrapper[4831]: W0309 16:31:02.565378 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod594b7519_72cc_45b8_ab1a_2bcac1e8f514.slice/crio-9c177052a6e8280c68511e7a8bc6e8294a8d3ccb5c2355196d021c5fca9ccd84 WatchSource:0}: Error finding container 9c177052a6e8280c68511e7a8bc6e8294a8d3ccb5c2355196d021c5fca9ccd84: Status 404 returned error can't find the container with id 9c177052a6e8280c68511e7a8bc6e8294a8d3ccb5c2355196d021c5fca9ccd84 Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.641740 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-zp7j5"] Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.644284 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.650546 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 16:31:02 crc kubenswrapper[4831]: W0309 16:31:02.666450 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0232625e_8378_498f_9689_2ba54adc50ed.slice/crio-d600e4ddaafa63d8addcb45b780b8eba419d2c9f06edcab113ac7ab696726742 WatchSource:0}: Error finding container d600e4ddaafa63d8addcb45b780b8eba419d2c9f06edcab113ac7ab696726742: Status 404 returned error can't find the container with id d600e4ddaafa63d8addcb45b780b8eba419d2c9f06edcab113ac7ab696726742 Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.705942 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-zp7j5"] Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.744756 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bcabb16-8d97-47d0-9e50-980536b98a36-log-httpd\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " 
pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.744813 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfwj2\" (UniqueName: \"kubernetes.io/projected/4bcabb16-8d97-47d0-9e50-980536b98a36-kube-api-access-hfwj2\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.744854 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bcabb16-8d97-47d0-9e50-980536b98a36-run-httpd\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.744886 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcabb16-8d97-47d0-9e50-980536b98a36-config-data\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.744906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4bcabb16-8d97-47d0-9e50-980536b98a36-etc-swift\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.846188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bcabb16-8d97-47d0-9e50-980536b98a36-log-httpd\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: 
\"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.846455 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfwj2\" (UniqueName: \"kubernetes.io/projected/4bcabb16-8d97-47d0-9e50-980536b98a36-kube-api-access-hfwj2\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.846499 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bcabb16-8d97-47d0-9e50-980536b98a36-run-httpd\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.846515 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcabb16-8d97-47d0-9e50-980536b98a36-config-data\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.846534 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4bcabb16-8d97-47d0-9e50-980536b98a36-etc-swift\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.847185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bcabb16-8d97-47d0-9e50-980536b98a36-log-httpd\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " 
pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.848175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bcabb16-8d97-47d0-9e50-980536b98a36-run-httpd\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.855104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcabb16-8d97-47d0-9e50-980536b98a36-config-data\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.855387 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4bcabb16-8d97-47d0-9e50-980536b98a36-etc-swift\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.863183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfwj2\" (UniqueName: \"kubernetes.io/projected/4bcabb16-8d97-47d0-9e50-980536b98a36-kube-api-access-hfwj2\") pod \"swift-proxy-76c998454c-zp7j5\" (UID: \"4bcabb16-8d97-47d0-9e50-980536b98a36\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:02 crc kubenswrapper[4831]: I0309 16:31:02.980748 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.261073 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-zp7j5"] Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.368012 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" event={"ID":"4bcabb16-8d97-47d0-9e50-980536b98a36","Type":"ContainerStarted","Data":"338a7bba965af34e4c80689e32c61dffe9e0ac9a8cce81954302be965e3d3da8"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.376609 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"684e979da6603f12437cb3a6ba5df412f3a3b721c1c3c9bc483d419623c3febf"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.376646 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"94550e162eb704f055516c0488b0fb6229ac3b7de1c2ecc7a7f49c0d02b589c4"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.376655 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"b37d5ec6e4fd59b9df89c0c4471259c5cc611beaca828e4a36ab98739999d7ce"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.376662 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"d600e4ddaafa63d8addcb45b780b8eba419d2c9f06edcab113ac7ab696726742"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.392487 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"74160d82aa2936c5fde9e8246d6476c9bde3a04f9cdcd4a66d7381755f3b3bb9"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.392744 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"f9b8a7b2d868c92129bfdd215c5586cb14e3488aa0ee49e33cdbd979072f512f"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.392755 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"622fb1166f868892feaa34bbe73e1d555039f721b49e56fd0b4eefbfed6f5b63"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.392763 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"dba5ed888e5460615a75409c756bbdcc058dca244a3fea62a4ad3f49c0efb959"} Mar 09 16:31:03 crc kubenswrapper[4831]: I0309 16:31:03.392774 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"9c177052a6e8280c68511e7a8bc6e8294a8d3ccb5c2355196d021c5fca9ccd84"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.404338 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"3d91fb47fa3f6bc97af891f393bed0997884d7809e469267aa076bb1a4cc560d"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.404710 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"dafe6d8992f45f192c45fa2dca96883cf8b69287bfc4db00eb48eafd6d019cbf"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.404727 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"0e9634bea302b05191f65fbe6821b5b9370f81f25d820237a3f71610d5f260ca"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.404740 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"fb06271352c69f0c0bd3d14cb70e39b6d30921d71430ece5f4aae6c583e90d36"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.404753 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"592603af639ad945021ee64e1b84ca6e8987f8964eb5c103cfcec875b795f9d9"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.404763 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"bb51e1366a59eff28ec082327481027042ce2e56c479d044b8ef8f8ff630be6b"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.406634 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" event={"ID":"4bcabb16-8d97-47d0-9e50-980536b98a36","Type":"ContainerStarted","Data":"6d16f08fc86aee05305575c784e68d456668dc046c5f58cfa641cfe3392d2610"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.406690 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" 
event={"ID":"4bcabb16-8d97-47d0-9e50-980536b98a36","Type":"ContainerStarted","Data":"f3f379776981d6165faf4cc5a855d3d84ae7e38f986846f215847e00a6c870e5"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.406739 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.418097 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"5304499660ef3d6cf5ecaf5aa75a49465895f0838b79ca71a49b7f42b1f4a564"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.418140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"dae28420d83fbb301adfe36cd580ac427e8f748998476d62a7bca716f44b6e2b"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.418152 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"aaab8b1322ec5cd08d6ab24fca7af78f5bcfc7b8edbe83980302fa7a729cec51"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.418161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"83ed6c05b9a85743620b9167f66c5c06fc3bd6210bca1bff71e862e830a90aa7"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.418171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"dbc0d35ed44319f1772c501a30ea623eff4f9616e6fb1ce01d82d8da88a017e1"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.418180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"b4d429df4e65a0a573e1f7ea7230201251de368203a34a590becc6374641fa07"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.420061 4831 generic.go:334] "Generic (PLEG): container finished" podID="f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" containerID="b586d3d078dc2463c14548774b22f585d8b393a971888bb3146c57ee121957c3" exitCode=0 Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.420115 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" event={"ID":"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f","Type":"ContainerDied","Data":"b586d3d078dc2463c14548774b22f585d8b393a971888bb3146c57ee121957c3"} Mar 09 16:31:04 crc kubenswrapper[4831]: I0309 16:31:04.429515 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" podStartSLOduration=2.429495094 podStartE2EDuration="2.429495094s" podCreationTimestamp="2026-03-09 16:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:04.425280013 +0000 UTC m=+1991.558962437" watchObservedRunningTime="2026-03-09 16:31:04.429495094 +0000 UTC m=+1991.563177517" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.436426 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"cf71bffd09a7047a4e30804c6900106bd5990c0be915d3fbeaac6448e2490e4d"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.436748 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"d1fcef62d12fea4f40e0b64ef28504b1a698ecfac3ea5e493fd0747765cbbe99"} Mar 09 16:31:05 crc 
kubenswrapper[4831]: I0309 16:31:05.436759 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"905fd221959ed2672c0638f77b62586640b46c4d0769264b5290bf53c1fef4e0"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.436768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"035afd5075167280dd124a7f5dab87dcd0a8cdb833f98df2d52508997aceee79"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.436776 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"594b7519-72cc-45b8-ab1a-2bcac1e8f514","Type":"ContainerStarted","Data":"c89907acfd00784ab00d1286ced9bfc72453c1c60c88ab8fcba0806b2f4d4aa6"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.443726 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"c7a8fd385b689735f32b8c18c2e858c1208262de748e3e8b8a702452329888aa"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.443764 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"714cde1bf60a4ea8c373894f0865232bcdc9de62a6560802d1b3a8b5cffeeb43"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.443774 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"6f7ac996bc9ae745289440c9ca9086cd196e9c1dbf85b7c886faf5349728619d"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.443782 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"b086eb05b8cc12a23c112663f287e10fb51afce0a71f4f540a0d8138f7227317"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.443792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"cb79b8ad6c804ae514dd6ff02c341bb0e9fb5d5042569be386fce16ffc5c4f45"} Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.443902 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.474219 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=5.474197436 podStartE2EDuration="5.474197436s" podCreationTimestamp="2026-03-09 16:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:05.466861007 +0000 UTC m=+1992.600543440" watchObservedRunningTime="2026-03-09 16:31:05.474197436 +0000 UTC m=+1992.607879859" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.542891 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pnsn2"] Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.549840 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pnsn2"] Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.569894 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-sltr4"] Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.571018 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.581643 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-sltr4"] Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.603820 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9kf\" (UniqueName: \"kubernetes.io/projected/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-kube-api-access-4c9kf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.603883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-swiftconf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.603939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-scripts\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.603956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-ring-data-devices\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.603973 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-dispersionconf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.604008 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-etc-swift\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.640548 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab982c0-f334-4924-beeb-912d170378d5" path="/var/lib/kubelet/pods/1ab982c0-f334-4924-beeb-912d170378d5/volumes" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.705272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-scripts\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.705337 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-ring-data-devices\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.705360 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-dispersionconf\") 
pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.705413 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-etc-swift\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.705452 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9kf\" (UniqueName: \"kubernetes.io/projected/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-kube-api-access-4c9kf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.705481 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-swiftconf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.706115 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-ring-data-devices\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.706240 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-etc-swift\") pod \"swift-ring-rebalance-sltr4\" (UID: 
\"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.706563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-scripts\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.713056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-dispersionconf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.713594 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-swiftconf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.720737 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9kf\" (UniqueName: \"kubernetes.io/projected/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-kube-api-access-4c9kf\") pod \"swift-ring-rebalance-sltr4\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.788277 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.806066 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-ring-data-devices\") pod \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.806124 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756cl\" (UniqueName: \"kubernetes.io/projected/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-kube-api-access-756cl\") pod \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.806205 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-etc-swift\") pod \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.806270 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-scripts\") pod \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.806301 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-dispersionconf\") pod \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.806340 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-swiftconf\") pod \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\" (UID: \"f71fb0d7-b1fc-463d-ae1d-f53497e47a4f\") " Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.807886 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" (UID: "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.808177 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" (UID: "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.825870 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" (UID: "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.829273 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-kube-api-access-756cl" (OuterVolumeSpecName: "kube-api-access-756cl") pod "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" (UID: "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f"). InnerVolumeSpecName "kube-api-access-756cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.831603 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m"] Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.835974 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" (UID: "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.836886 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-scripts" (OuterVolumeSpecName: "scripts") pod "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" (UID: "f71fb0d7-b1fc-463d-ae1d-f53497e47a4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.842152 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m"] Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.893862 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.910701 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.910963 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.911062 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.911163 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.911241 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.911320 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756cl\" (UniqueName: \"kubernetes.io/projected/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f-kube-api-access-756cl\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.964250 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7lccs"] Mar 09 16:31:05 crc kubenswrapper[4831]: E0309 16:31:05.964570 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" containerName="swift-ring-rebalance" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.964582 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" containerName="swift-ring-rebalance" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.965386 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" containerName="swift-ring-rebalance" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.970515 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:05 crc kubenswrapper[4831]: I0309 16:31:05.995581 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7lccs"] Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.015169 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-ring-data-devices\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.015233 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrzh\" (UniqueName: \"kubernetes.io/projected/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-kube-api-access-4rrzh\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.015258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-swiftconf\") pod 
\"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.015295 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-dispersionconf\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.015343 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-scripts\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.015363 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-etc-swift\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.117064 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-scripts\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.117377 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-etc-swift\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.117454 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-ring-data-devices\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.117510 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrzh\" (UniqueName: \"kubernetes.io/projected/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-kube-api-access-4rrzh\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.117538 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-swiftconf\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.117574 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-dispersionconf\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.117785 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-etc-swift\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.118682 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-ring-data-devices\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.119017 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-scripts\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.125710 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-swiftconf\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.132024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-dispersionconf\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.146487 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrzh\" (UniqueName: 
\"kubernetes.io/projected/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-kube-api-access-4rrzh\") pod \"swift-ring-rebalance-debug-7lccs\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: W0309 16:31:06.181768 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4aa9c32_5e03_4e35_b545_2d6a820ebcb1.slice/crio-333bc1aaaf058a25f9d50788ff136a359c5855bfaf691d06d04ca0c4e37e8b4e WatchSource:0}: Error finding container 333bc1aaaf058a25f9d50788ff136a359c5855bfaf691d06d04ca0c4e37e8b4e: Status 404 returned error can't find the container with id 333bc1aaaf058a25f9d50788ff136a359c5855bfaf691d06d04ca0c4e37e8b4e Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.184271 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-sltr4"] Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.296331 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.467529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" event={"ID":"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1","Type":"ContainerStarted","Data":"9fa8511fb5d02d9dbaa4b2695ab278f00fceb6246a341e62280fa237a70f21ce"} Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.467574 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" event={"ID":"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1","Type":"ContainerStarted","Data":"333bc1aaaf058a25f9d50788ff136a359c5855bfaf691d06d04ca0c4e37e8b4e"} Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.559184 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"0232625e-8378-498f-9689-2ba54adc50ed","Type":"ContainerStarted","Data":"4922b1eb1104e101b458a7d8f82a9e5fa8a82bf1acb48087a21d211778ed7dcd"} Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.566890 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" podStartSLOduration=1.566852866 podStartE2EDuration="1.566852866s" podCreationTimestamp="2026-03-09 16:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:06.514190024 +0000 UTC m=+1993.647872457" watchObservedRunningTime="2026-03-09 16:31:06.566852866 +0000 UTC m=+1993.700535289" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.575340 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7lccs"] Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.584628 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbp9m" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.593085 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e8faf9f0683d17319b8e765e113622595767f1fda3c1169c83591d3402fab6" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.610042 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7lccs"] Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.637747 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=6.637722427 podStartE2EDuration="6.637722427s" podCreationTimestamp="2026-03-09 16:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:06.628072072 +0000 UTC m=+1993.761754495" watchObservedRunningTime="2026-03-09 16:31:06.637722427 +0000 UTC m=+1993.771404850" Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.655728 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.656328 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-server" containerID="cri-o://0d0615e21966787026795ed21aec5a8b9150ef6e98421a2dea83c8969206ad36" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.656726 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-sharder" containerID="cri-o://03ce249750b91f5a4458b3f2fcebe229ac19f428bff35a9577f87ce0c686fcb8" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.656790 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="swift-recon-cron" containerID="cri-o://cd92eef9b48e2fbc409ddacc00af540edf1d62d35e807b6d8fbd4de17e75fbcf" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.656864 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="rsync" containerID="cri-o://5bb292b424172609a8fdd92d252e1a716d5502e9e585df3b898f566dc4a99933" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.656906 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-expirer" containerID="cri-o://ff7554df134a9b987aee98588a94d5b4291b3007d236dab11ef4a5269c5ec155" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.656947 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-updater" containerID="cri-o://1b1b7ad9b3b25fc4186459e5b20227c37302368c4c8864f045098981360f9f7d" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657009 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-auditor" containerID="cri-o://94847ce5f0acec892e5cd89129e447e396a43060107ba95f8c9b139186d9eb0c" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657047 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-replicator" 
containerID="cri-o://aaf90ce0ab9bffc7318f115d84669f3df2de828d8b5c3f705b46a2267c62ae7c" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657087 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-server" containerID="cri-o://0c7e5174d9f672056afce49e92e06209ed7d7d94d12faa9878b386f540aae91c" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657127 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-updater" containerID="cri-o://4e1ddecb424646252918e3a4fadf6abbc887ee68bff6e3eb9bd5e9554affb3ee" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657174 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-auditor" containerID="cri-o://28154e98042b7c18f39a657cd1e47647018cb477b9c733e3c3a7200e09660b22" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657210 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-replicator" containerID="cri-o://e90d8f1abbb3b1a24dde3c4a5692e4e3558f9fe6f7c6d524d7d4648442f6669f" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657245 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-server" containerID="cri-o://08dc9956f0db87fd55c19fef4d06f06377b555bf9915bea9307ac658a572a474" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657283 4831 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-reaper" containerID="cri-o://bd11ab7de11a331b0134fa2038dd2fea81edc6e95ade077be08319319dd86742" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657329 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-auditor" containerID="cri-o://ba5071b51b0c0f395d04955d5cc3b833d94fb61752ff6aa3c2a98edc18ec75ea" gracePeriod=30 Mar 09 16:31:06 crc kubenswrapper[4831]: I0309 16:31:06.657368 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-replicator" containerID="cri-o://52c06dba774928e5339332493c207e503eee09c0c77395cab009b68d9dc446c8" gracePeriod=30 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.593321 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" event={"ID":"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b","Type":"ContainerStarted","Data":"aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.593665 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" event={"ID":"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b","Type":"ContainerStarted","Data":"9124dfa3c6e191eda8d572b259f8b928031c5beca3a83b0916c7bb50f2422960"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.593837 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" podUID="b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" containerName="swift-ring-rebalance" containerID="cri-o://aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca" 
gracePeriod=30 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614457 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="03ce249750b91f5a4458b3f2fcebe229ac19f428bff35a9577f87ce0c686fcb8" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614491 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="5bb292b424172609a8fdd92d252e1a716d5502e9e585df3b898f566dc4a99933" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614500 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="ff7554df134a9b987aee98588a94d5b4291b3007d236dab11ef4a5269c5ec155" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614507 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="1b1b7ad9b3b25fc4186459e5b20227c37302368c4c8864f045098981360f9f7d" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614515 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="94847ce5f0acec892e5cd89129e447e396a43060107ba95f8c9b139186d9eb0c" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614524 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="aaf90ce0ab9bffc7318f115d84669f3df2de828d8b5c3f705b46a2267c62ae7c" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614532 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="0c7e5174d9f672056afce49e92e06209ed7d7d94d12faa9878b386f540aae91c" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614539 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" 
containerID="4e1ddecb424646252918e3a4fadf6abbc887ee68bff6e3eb9bd5e9554affb3ee" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614545 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="28154e98042b7c18f39a657cd1e47647018cb477b9c733e3c3a7200e09660b22" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614552 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="e90d8f1abbb3b1a24dde3c4a5692e4e3558f9fe6f7c6d524d7d4648442f6669f" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614558 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="08dc9956f0db87fd55c19fef4d06f06377b555bf9915bea9307ac658a572a474" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614565 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="bd11ab7de11a331b0134fa2038dd2fea81edc6e95ade077be08319319dd86742" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614572 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="ba5071b51b0c0f395d04955d5cc3b833d94fb61752ff6aa3c2a98edc18ec75ea" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614580 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="52c06dba774928e5339332493c207e503eee09c0c77395cab009b68d9dc446c8" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614587 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="0d0615e21966787026795ed21aec5a8b9150ef6e98421a2dea83c8969206ad36" exitCode=0 Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614585 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"03ce249750b91f5a4458b3f2fcebe229ac19f428bff35a9577f87ce0c686fcb8"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614659 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"5bb292b424172609a8fdd92d252e1a716d5502e9e585df3b898f566dc4a99933"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614680 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"ff7554df134a9b987aee98588a94d5b4291b3007d236dab11ef4a5269c5ec155"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614697 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"1b1b7ad9b3b25fc4186459e5b20227c37302368c4c8864f045098981360f9f7d"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614714 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"94847ce5f0acec892e5cd89129e447e396a43060107ba95f8c9b139186d9eb0c"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"aaf90ce0ab9bffc7318f115d84669f3df2de828d8b5c3f705b46a2267c62ae7c"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"0c7e5174d9f672056afce49e92e06209ed7d7d94d12faa9878b386f540aae91c"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614765 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"4e1ddecb424646252918e3a4fadf6abbc887ee68bff6e3eb9bd5e9554affb3ee"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614781 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"28154e98042b7c18f39a657cd1e47647018cb477b9c733e3c3a7200e09660b22"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614799 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"e90d8f1abbb3b1a24dde3c4a5692e4e3558f9fe6f7c6d524d7d4648442f6669f"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614844 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"08dc9956f0db87fd55c19fef4d06f06377b555bf9915bea9307ac658a572a474"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614863 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"bd11ab7de11a331b0134fa2038dd2fea81edc6e95ade077be08319319dd86742"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614879 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"ba5071b51b0c0f395d04955d5cc3b833d94fb61752ff6aa3c2a98edc18ec75ea"} Mar 09 
16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614894 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"52c06dba774928e5339332493c207e503eee09c0c77395cab009b68d9dc446c8"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.614930 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"0d0615e21966787026795ed21aec5a8b9150ef6e98421a2dea83c8969206ad36"} Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.629826 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" podStartSLOduration=2.629804988 podStartE2EDuration="2.629804988s" podCreationTimestamp="2026-03-09 16:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:07.617012633 +0000 UTC m=+1994.750695096" watchObservedRunningTime="2026-03-09 16:31:07.629804988 +0000 UTC m=+1994.763487411" Mar 09 16:31:07 crc kubenswrapper[4831]: I0309 16:31:07.630786 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71fb0d7-b1fc-463d-ae1d-f53497e47a4f" path="/var/lib/kubelet/pods/f71fb0d7-b1fc-463d-ae1d-f53497e47a4f/volumes" Mar 09 16:31:12 crc kubenswrapper[4831]: I0309 16:31:12.985175 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:12 crc kubenswrapper[4831]: I0309 16:31:12.987756 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-zp7j5" Mar 09 16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.081033 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw"] Mar 09 
16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.081633 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-server" containerID="cri-o://1ef24f3552ff990790e620c5aad05a2418bc9b14dadf3dcb82fbade7475fb162" gracePeriod=30 Mar 09 16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.082162 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-httpd" containerID="cri-o://69e4cf1ca8fab537dfb3682664289a30d2b10b29537902236c2f2bd6d37b68cc" gracePeriod=30 Mar 09 16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.683929 4831 generic.go:334] "Generic (PLEG): container finished" podID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerID="1ef24f3552ff990790e620c5aad05a2418bc9b14dadf3dcb82fbade7475fb162" exitCode=0 Mar 09 16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.684334 4831 generic.go:334] "Generic (PLEG): container finished" podID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerID="69e4cf1ca8fab537dfb3682664289a30d2b10b29537902236c2f2bd6d37b68cc" exitCode=0 Mar 09 16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.684279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" event={"ID":"591a000d-158e-4105-98c1-0fd75c2aa00c","Type":"ContainerDied","Data":"1ef24f3552ff990790e620c5aad05a2418bc9b14dadf3dcb82fbade7475fb162"} Mar 09 16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.684654 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" event={"ID":"591a000d-158e-4105-98c1-0fd75c2aa00c","Type":"ContainerDied","Data":"69e4cf1ca8fab537dfb3682664289a30d2b10b29537902236c2f2bd6d37b68cc"} Mar 09 16:31:13 crc kubenswrapper[4831]: I0309 16:31:13.968377 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.142309 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-etc-swift\") pod \"591a000d-158e-4105-98c1-0fd75c2aa00c\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.142513 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-run-httpd\") pod \"591a000d-158e-4105-98c1-0fd75c2aa00c\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.142854 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "591a000d-158e-4105-98c1-0fd75c2aa00c" (UID: "591a000d-158e-4105-98c1-0fd75c2aa00c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.143039 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591a000d-158e-4105-98c1-0fd75c2aa00c-config-data\") pod \"591a000d-158e-4105-98c1-0fd75c2aa00c\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.143513 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "591a000d-158e-4105-98c1-0fd75c2aa00c" (UID: "591a000d-158e-4105-98c1-0fd75c2aa00c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.143676 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-log-httpd\") pod \"591a000d-158e-4105-98c1-0fd75c2aa00c\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.143757 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm8ww\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-kube-api-access-nm8ww\") pod \"591a000d-158e-4105-98c1-0fd75c2aa00c\" (UID: \"591a000d-158e-4105-98c1-0fd75c2aa00c\") " Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.144108 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.144133 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/591a000d-158e-4105-98c1-0fd75c2aa00c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.149085 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "591a000d-158e-4105-98c1-0fd75c2aa00c" (UID: "591a000d-158e-4105-98c1-0fd75c2aa00c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.150153 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-kube-api-access-nm8ww" (OuterVolumeSpecName: "kube-api-access-nm8ww") pod "591a000d-158e-4105-98c1-0fd75c2aa00c" (UID: "591a000d-158e-4105-98c1-0fd75c2aa00c"). InnerVolumeSpecName "kube-api-access-nm8ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.199594 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591a000d-158e-4105-98c1-0fd75c2aa00c-config-data" (OuterVolumeSpecName: "config-data") pod "591a000d-158e-4105-98c1-0fd75c2aa00c" (UID: "591a000d-158e-4105-98c1-0fd75c2aa00c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.245243 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591a000d-158e-4105-98c1-0fd75c2aa00c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.245291 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm8ww\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-kube-api-access-nm8ww\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.245307 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/591a000d-158e-4105-98c1-0fd75c2aa00c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.693646 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" 
event={"ID":"591a000d-158e-4105-98c1-0fd75c2aa00c","Type":"ContainerDied","Data":"4f53de33b2091a2146fbff123e2a458b36f61bef7967d6dbc7d166fc95110721"} Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.693903 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.693998 4831 scope.go:117] "RemoveContainer" containerID="1ef24f3552ff990790e620c5aad05a2418bc9b14dadf3dcb82fbade7475fb162" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.764301 4831 scope.go:117] "RemoveContainer" containerID="69e4cf1ca8fab537dfb3682664289a30d2b10b29537902236c2f2bd6d37b68cc" Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.775207 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw"] Mar 09 16:31:14 crc kubenswrapper[4831]: I0309 16:31:14.785540 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-67d5466c69-m5wnw"] Mar 09 16:31:15 crc kubenswrapper[4831]: I0309 16:31:15.626888 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" path="/var/lib/kubelet/pods/591a000d-158e-4105-98c1-0fd75c2aa00c/volumes" Mar 09 16:31:15 crc kubenswrapper[4831]: I0309 16:31:15.702863 4831 generic.go:334] "Generic (PLEG): container finished" podID="b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" containerID="9fa8511fb5d02d9dbaa4b2695ab278f00fceb6246a341e62280fa237a70f21ce" exitCode=0 Mar 09 16:31:15 crc kubenswrapper[4831]: I0309 16:31:15.702910 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" event={"ID":"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1","Type":"ContainerDied","Data":"9fa8511fb5d02d9dbaa4b2695ab278f00fceb6246a341e62280fa237a70f21ce"} Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.000236 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.096342 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-dispersionconf\") pod \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.096410 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-scripts\") pod \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.096473 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c9kf\" (UniqueName: \"kubernetes.io/projected/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-kube-api-access-4c9kf\") pod \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.096506 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-ring-data-devices\") pod \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.096523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-swiftconf\") pod \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.096543 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-etc-swift\") pod \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\" (UID: \"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1\") " Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.097224 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" (UID: "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.097433 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.097547 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" (UID: "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.104737 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-kube-api-access-4c9kf" (OuterVolumeSpecName: "kube-api-access-4c9kf") pod "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" (UID: "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1"). InnerVolumeSpecName "kube-api-access-4c9kf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.115742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-scripts" (OuterVolumeSpecName: "scripts") pod "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" (UID: "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.119410 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" (UID: "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.123871 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" (UID: "b4aa9c32-5e03-4e35-b545-2d6a820ebcb1"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.199469 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c9kf\" (UniqueName: \"kubernetes.io/projected/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-kube-api-access-4c9kf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.199510 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.199520 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.199530 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.199538 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4aa9c32-5e03-4e35-b545-2d6a820ebcb1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.721306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" event={"ID":"b4aa9c32-5e03-4e35-b545-2d6a820ebcb1","Type":"ContainerDied","Data":"333bc1aaaf058a25f9d50788ff136a359c5855bfaf691d06d04ca0c4e37e8b4e"} Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.721363 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="333bc1aaaf058a25f9d50788ff136a359c5855bfaf691d06d04ca0c4e37e8b4e" Mar 09 16:31:17 crc kubenswrapper[4831]: I0309 16:31:17.721442 4831 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-sltr4" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.668373 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.773641 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-scripts\") pod \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.773749 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-swiftconf\") pod \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.773826 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-ring-data-devices\") pod \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.773893 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-dispersionconf\") pod \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.773918 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-etc-swift\") pod \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\" (UID: 
\"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.773941 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rrzh\" (UniqueName: \"kubernetes.io/projected/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-kube-api-access-4rrzh\") pod \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\" (UID: \"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b\") " Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.774441 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" (UID: "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.774932 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.775127 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" (UID: "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.779265 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-kube-api-access-4rrzh" (OuterVolumeSpecName: "kube-api-access-4rrzh") pod "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" (UID: "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b"). InnerVolumeSpecName "kube-api-access-4rrzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.793988 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" (UID: "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.799953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" (UID: "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.802874 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" containerID="aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca" exitCode=1 Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.802917 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" event={"ID":"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b","Type":"ContainerDied","Data":"aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca"} Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.802949 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" event={"ID":"b8e3adb4-6684-4a23-aaa4-f5f3271aff9b","Type":"ContainerDied","Data":"9124dfa3c6e191eda8d572b259f8b928031c5beca3a83b0916c7bb50f2422960"} Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.802969 4831 scope.go:117] "RemoveContainer" containerID="aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca" Mar 
09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.802962 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7lccs" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.808663 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-scripts" (OuterVolumeSpecName: "scripts") pod "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" (UID: "b8e3adb4-6684-4a23-aaa4-f5f3271aff9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.849117 4831 scope.go:117] "RemoveContainer" containerID="aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca" Mar 09 16:31:26 crc kubenswrapper[4831]: E0309 16:31:26.855414 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca\": container with ID starting with aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca not found: ID does not exist" containerID="aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.855473 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca"} err="failed to get container status \"aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca\": rpc error: code = NotFound desc = could not find container \"aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca\": container with ID starting with aa96985f93644b1c23326a39d3e3254b7a7000a0f7a6e072253ed87077f43eca not found: ID does not exist" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.875506 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.875548 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.875562 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.875574 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.875587 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rrzh\" (UniqueName: \"kubernetes.io/projected/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b-kube-api-access-4rrzh\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.937892 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7lccs"] Mar 09 16:31:26 crc kubenswrapper[4831]: I0309 16:31:26.944936 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7lccs"] Mar 09 16:31:27 crc kubenswrapper[4831]: I0309 16:31:27.630751 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" path="/var/lib/kubelet/pods/b8e3adb4-6684-4a23-aaa4-f5f3271aff9b/volumes" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.147491 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7k785"] Mar 09 16:31:28 crc kubenswrapper[4831]: 
E0309 16:31:28.148173 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" containerName="swift-ring-rebalance" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.148212 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" containerName="swift-ring-rebalance" Mar 09 16:31:28 crc kubenswrapper[4831]: E0309 16:31:28.148229 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-httpd" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.148237 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-httpd" Mar 09 16:31:28 crc kubenswrapper[4831]: E0309 16:31:28.148254 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-server" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.148261 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-server" Mar 09 16:31:28 crc kubenswrapper[4831]: E0309 16:31:28.148283 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" containerName="swift-ring-rebalance" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.148291 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" containerName="swift-ring-rebalance" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.148507 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4aa9c32-5e03-4e35-b545-2d6a820ebcb1" containerName="swift-ring-rebalance" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.148530 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-httpd" Mar 09 16:31:28 crc 
kubenswrapper[4831]: I0309 16:31:28.148546 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e3adb4-6684-4a23-aaa4-f5f3271aff9b" containerName="swift-ring-rebalance" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.148565 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="591a000d-158e-4105-98c1-0fd75c2aa00c" containerName="proxy-server" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.149103 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.152184 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.153057 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.175131 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7k785"] Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.295794 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ac593-4c25-4a4a-9d47-7c37dfce2b54-etc-swift\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.296086 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-ring-data-devices\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: 
I0309 16:31:28.296208 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-scripts\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.296254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-swiftconf\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.296320 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-dispersionconf\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.297106 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sxg5\" (UniqueName: \"kubernetes.io/projected/571ac593-4c25-4a4a-9d47-7c37dfce2b54-kube-api-access-8sxg5\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.399018 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-dispersionconf\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.399156 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sxg5\" (UniqueName: \"kubernetes.io/projected/571ac593-4c25-4a4a-9d47-7c37dfce2b54-kube-api-access-8sxg5\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.399208 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ac593-4c25-4a4a-9d47-7c37dfce2b54-etc-swift\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.399263 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-ring-data-devices\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.399315 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-scripts\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.399347 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-swiftconf\") pod \"swift-ring-rebalance-debug-7k785\" (UID: 
\"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.400145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-ring-data-devices\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.400323 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-scripts\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.400626 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ac593-4c25-4a4a-9d47-7c37dfce2b54-etc-swift\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.408323 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-swiftconf\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.408355 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-dispersionconf\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.417936 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sxg5\" (UniqueName: \"kubernetes.io/projected/571ac593-4c25-4a4a-9d47-7c37dfce2b54-kube-api-access-8sxg5\") pod \"swift-ring-rebalance-debug-7k785\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.471103 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:28 crc kubenswrapper[4831]: I0309 16:31:28.906302 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7k785"] Mar 09 16:31:28 crc kubenswrapper[4831]: W0309 16:31:28.917141 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod571ac593_4c25_4a4a_9d47_7c37dfce2b54.slice/crio-7ffabf176dfcbda9768256b8d12087a3613ed420eb069ad8bd3e78f11bc6c1ec WatchSource:0}: Error finding container 7ffabf176dfcbda9768256b8d12087a3613ed420eb069ad8bd3e78f11bc6c1ec: Status 404 returned error can't find the container with id 7ffabf176dfcbda9768256b8d12087a3613ed420eb069ad8bd3e78f11bc6c1ec Mar 09 16:31:29 crc kubenswrapper[4831]: I0309 16:31:29.864473 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" event={"ID":"571ac593-4c25-4a4a-9d47-7c37dfce2b54","Type":"ContainerStarted","Data":"652886be26b14f2029241f544dec471acf6e2627541a8cc9ee6ae1821b3662b8"} Mar 09 16:31:29 crc kubenswrapper[4831]: I0309 16:31:29.864802 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" 
event={"ID":"571ac593-4c25-4a4a-9d47-7c37dfce2b54","Type":"ContainerStarted","Data":"7ffabf176dfcbda9768256b8d12087a3613ed420eb069ad8bd3e78f11bc6c1ec"} Mar 09 16:31:29 crc kubenswrapper[4831]: I0309 16:31:29.889354 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" podStartSLOduration=1.889333087 podStartE2EDuration="1.889333087s" podCreationTimestamp="2026-03-09 16:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:29.882210055 +0000 UTC m=+2017.015892488" watchObservedRunningTime="2026-03-09 16:31:29.889333087 +0000 UTC m=+2017.023015510" Mar 09 16:31:36 crc kubenswrapper[4831]: I0309 16:31:36.946424 4831 generic.go:334] "Generic (PLEG): container finished" podID="11a56635-2024-4f89-98be-207e7a1176fe" containerID="cd92eef9b48e2fbc409ddacc00af540edf1d62d35e807b6d8fbd4de17e75fbcf" exitCode=137 Mar 09 16:31:36 crc kubenswrapper[4831]: I0309 16:31:36.946967 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"cd92eef9b48e2fbc409ddacc00af540edf1d62d35e807b6d8fbd4de17e75fbcf"} Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.041307 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.142583 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn7r6\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-kube-api-access-bn7r6\") pod \"11a56635-2024-4f89-98be-207e7a1176fe\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.142654 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-lock\") pod \"11a56635-2024-4f89-98be-207e7a1176fe\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.142674 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") pod \"11a56635-2024-4f89-98be-207e7a1176fe\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.142783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-cache\") pod \"11a56635-2024-4f89-98be-207e7a1176fe\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.142808 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"11a56635-2024-4f89-98be-207e7a1176fe\" (UID: \"11a56635-2024-4f89-98be-207e7a1176fe\") " Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.143165 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-lock" (OuterVolumeSpecName: 
"lock") pod "11a56635-2024-4f89-98be-207e7a1176fe" (UID: "11a56635-2024-4f89-98be-207e7a1176fe"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.143340 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-cache" (OuterVolumeSpecName: "cache") pod "11a56635-2024-4f89-98be-207e7a1176fe" (UID: "11a56635-2024-4f89-98be-207e7a1176fe"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.147912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "11a56635-2024-4f89-98be-207e7a1176fe" (UID: "11a56635-2024-4f89-98be-207e7a1176fe"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.149354 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-kube-api-access-bn7r6" (OuterVolumeSpecName: "kube-api-access-bn7r6") pod "11a56635-2024-4f89-98be-207e7a1176fe" (UID: "11a56635-2024-4f89-98be-207e7a1176fe"). InnerVolumeSpecName "kube-api-access-bn7r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.154089 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "swift") pod "11a56635-2024-4f89-98be-207e7a1176fe" (UID: "11a56635-2024-4f89-98be-207e7a1176fe"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.244911 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn7r6\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-kube-api-access-bn7r6\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.244957 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11a56635-2024-4f89-98be-207e7a1176fe-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.244971 4831 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-lock\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.244981 4831 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11a56635-2024-4f89-98be-207e7a1176fe-cache\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.245055 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.264669 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.346946 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.959168 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"11a56635-2024-4f89-98be-207e7a1176fe","Type":"ContainerDied","Data":"c1cb9eff57a84c22211bbd32b81d7e8e1519c4ddccd6e902928bb74c734e4398"} Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.959498 4831 scope.go:117] "RemoveContainer" containerID="03ce249750b91f5a4458b3f2fcebe229ac19f428bff35a9577f87ce0c686fcb8" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.959269 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.987463 4831 scope.go:117] "RemoveContainer" containerID="cd92eef9b48e2fbc409ddacc00af540edf1d62d35e807b6d8fbd4de17e75fbcf" Mar 09 16:31:37 crc kubenswrapper[4831]: I0309 16:31:37.988991 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.000953 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.009270 4831 scope.go:117] "RemoveContainer" containerID="5bb292b424172609a8fdd92d252e1a716d5502e9e585df3b898f566dc4a99933" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.029835 4831 scope.go:117] "RemoveContainer" containerID="ff7554df134a9b987aee98588a94d5b4291b3007d236dab11ef4a5269c5ec155" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.031758 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032092 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032116 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032126 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="swift-recon-cron" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032135 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="swift-recon-cron" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032155 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-sharder" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032167 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-sharder" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032184 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="rsync" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032191 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="rsync" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032202 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-expirer" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032214 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-expirer" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032234 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032242 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032251 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-server" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032258 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-server" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032269 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032276 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032290 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-server" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032336 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-server" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.032350 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-updater" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.032359 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-updater" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.039473 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039521 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.039539 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-server" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039547 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-server" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.039558 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-updater" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039565 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-updater" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.039589 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039596 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.039608 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039614 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: E0309 16:31:38.039623 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-reaper" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039632 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-reaper" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039878 4831 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039895 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-server" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039902 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039915 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="swift-recon-cron" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039924 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-updater" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039932 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-updater" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039939 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039950 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-replicator" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039960 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039966 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-server" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 
16:31:38.039975 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="rsync" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039985 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-server" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.039995 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-expirer" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.040003 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="account-reaper" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.040010 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="container-sharder" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.040019 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a56635-2024-4f89-98be-207e7a1176fe" containerName="object-auditor" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.054311 4831 scope.go:117] "RemoveContainer" containerID="1b1b7ad9b3b25fc4186459e5b20227c37302368c4c8864f045098981360f9f7d" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.054730 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.079964 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.082642 4831 scope.go:117] "RemoveContainer" containerID="94847ce5f0acec892e5cd89129e447e396a43060107ba95f8c9b139186d9eb0c" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.100858 4831 scope.go:117] "RemoveContainer" containerID="aaf90ce0ab9bffc7318f115d84669f3df2de828d8b5c3f705b46a2267c62ae7c" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.119196 4831 scope.go:117] "RemoveContainer" containerID="0c7e5174d9f672056afce49e92e06209ed7d7d94d12faa9878b386f540aae91c" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.133881 4831 scope.go:117] "RemoveContainer" containerID="4e1ddecb424646252918e3a4fadf6abbc887ee68bff6e3eb9bd5e9554affb3ee" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.149103 4831 scope.go:117] "RemoveContainer" containerID="28154e98042b7c18f39a657cd1e47647018cb477b9c733e3c3a7200e09660b22" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.162540 4831 scope.go:117] "RemoveContainer" containerID="e90d8f1abbb3b1a24dde3c4a5692e4e3558f9fe6f7c6d524d7d4648442f6669f" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.173199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.173257 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8djr7\" (UniqueName: \"kubernetes.io/projected/065611bd-2987-4efb-b743-8045f7ec18fc-kube-api-access-8djr7\") pod \"swift-storage-0\" (UID: 
\"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.173284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/065611bd-2987-4efb-b743-8045f7ec18fc-lock\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.173303 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/065611bd-2987-4efb-b743-8045f7ec18fc-etc-swift\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.173348 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/065611bd-2987-4efb-b743-8045f7ec18fc-cache\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.179744 4831 scope.go:117] "RemoveContainer" containerID="08dc9956f0db87fd55c19fef4d06f06377b555bf9915bea9307ac658a572a474" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.196629 4831 scope.go:117] "RemoveContainer" containerID="bd11ab7de11a331b0134fa2038dd2fea81edc6e95ade077be08319319dd86742" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.210848 4831 scope.go:117] "RemoveContainer" containerID="ba5071b51b0c0f395d04955d5cc3b833d94fb61752ff6aa3c2a98edc18ec75ea" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.226249 4831 scope.go:117] "RemoveContainer" containerID="52c06dba774928e5339332493c207e503eee09c0c77395cab009b68d9dc446c8" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.239351 4831 
scope.go:117] "RemoveContainer" containerID="0d0615e21966787026795ed21aec5a8b9150ef6e98421a2dea83c8969206ad36" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.274557 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.274618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8djr7\" (UniqueName: \"kubernetes.io/projected/065611bd-2987-4efb-b743-8045f7ec18fc-kube-api-access-8djr7\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.274659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/065611bd-2987-4efb-b743-8045f7ec18fc-lock\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.274680 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/065611bd-2987-4efb-b743-8045f7ec18fc-etc-swift\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.274743 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/065611bd-2987-4efb-b743-8045f7ec18fc-cache\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.274939 4831 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") device mount path \"/mnt/openstack/pv01\"" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.275281 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/065611bd-2987-4efb-b743-8045f7ec18fc-cache\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.275309 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/065611bd-2987-4efb-b743-8045f7ec18fc-lock\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.284689 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/065611bd-2987-4efb-b743-8045f7ec18fc-etc-swift\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.289446 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8djr7\" (UniqueName: \"kubernetes.io/projected/065611bd-2987-4efb-b743-8045f7ec18fc-kube-api-access-8djr7\") pod \"swift-storage-0\" (UID: \"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.293414 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: 
\"065611bd-2987-4efb-b743-8045f7ec18fc\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.381673 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.848497 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 16:31:38 crc kubenswrapper[4831]: W0309 16:31:38.860140 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod065611bd_2987_4efb_b743_8045f7ec18fc.slice/crio-5f8c243a0c5b0813e00e6e3fec58e557f418870fcb3eed23e565cd0655d04cf2 WatchSource:0}: Error finding container 5f8c243a0c5b0813e00e6e3fec58e557f418870fcb3eed23e565cd0655d04cf2: Status 404 returned error can't find the container with id 5f8c243a0c5b0813e00e6e3fec58e557f418870fcb3eed23e565cd0655d04cf2 Mar 09 16:31:38 crc kubenswrapper[4831]: I0309 16:31:38.969465 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"5f8c243a0c5b0813e00e6e3fec58e557f418870fcb3eed23e565cd0655d04cf2"} Mar 09 16:31:39 crc kubenswrapper[4831]: I0309 16:31:39.651249 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a56635-2024-4f89-98be-207e7a1176fe" path="/var/lib/kubelet/pods/11a56635-2024-4f89-98be-207e7a1176fe/volumes" Mar 09 16:31:39 crc kubenswrapper[4831]: I0309 16:31:39.983689 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"789dddbdae4ab5e71d434b7bd3367e52c5a5b991f0bfa735a02bc22ea92f9a47"} Mar 09 16:31:39 crc kubenswrapper[4831]: I0309 16:31:39.983733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"ad4e5620c630ad2bd32dae272a5e11575b741b168fd2dbc8febf7682ec377b50"} Mar 09 16:31:39 crc kubenswrapper[4831]: I0309 16:31:39.983743 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"2e3ed71667b6093a0745a80ba632f99a9d19dd21691508286629854cdc47532e"} Mar 09 16:31:39 crc kubenswrapper[4831]: I0309 16:31:39.983751 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"0502999c5b169586ae0b7163b26dca27a8975856bb35a0ad8a9f14d2663ca513"} Mar 09 16:31:39 crc kubenswrapper[4831]: I0309 16:31:39.983759 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"4f27109692ffeeef040d2f69a0c7ff66582c5e4cbecbfa810a02e39c82a4c24a"} Mar 09 16:31:39 crc kubenswrapper[4831]: I0309 16:31:39.983768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"d40b406a034acd8178dccc89078043ff17dcd8a619a2453e2ed4a884a5908db8"} Mar 09 16:31:41 crc kubenswrapper[4831]: I0309 16:31:40.999851 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"23cfc7daa40168e24bdfe8e69bb38583f3d8ef3ada62ceb41b5855a9adcf4ffa"} Mar 09 16:31:41 crc kubenswrapper[4831]: I0309 16:31:41.000380 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"8f98acd74a03a8a683ebfd19fc7303b77254a2f05206bc8f098830e00a33e85f"} Mar 09 16:31:41 crc kubenswrapper[4831]: I0309 16:31:41.000393 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"0f20e4671a5912a48ab2590f65b4c0fce669acc5e80bf90d8f49e0582c264238"} Mar 09 16:31:41 crc kubenswrapper[4831]: I0309 16:31:41.000421 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"b3bda51f4a4cb8c183d65a014967bff3231e4d5fced4c5d931ffe86f545513ac"} Mar 09 16:31:41 crc kubenswrapper[4831]: I0309 16:31:41.000461 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"a739e41e82baf808e121fd2fb34d63b1ea07cc917d2d2f6211400fb5768a5486"} Mar 09 16:31:41 crc kubenswrapper[4831]: I0309 16:31:41.000470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"99fd4221c38889b6c6eecf5529a37c8a470f80359fea334346df8201a7a317a6"} Mar 09 16:31:41 crc kubenswrapper[4831]: I0309 16:31:41.000480 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"a5a1538f58d64bbea62f519b9dca05cb1749086057f97dd227542e1c0c198757"} Mar 09 16:31:42 crc kubenswrapper[4831]: I0309 16:31:42.013723 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"0f432b4cfa15ca59a7bbd9a7fea60437351a8c9451dca39117c239fb16f75986"} Mar 09 16:31:42 crc kubenswrapper[4831]: I0309 16:31:42.014083 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"065611bd-2987-4efb-b743-8045f7ec18fc","Type":"ContainerStarted","Data":"0d675ee4143285786dc53b84e9abb3ed4ef06f90a02ad0a02bdd2302f37c0d41"} Mar 09 16:31:42 crc kubenswrapper[4831]: I0309 16:31:42.059213 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=4.059190293 podStartE2EDuration="4.059190293s" podCreationTimestamp="2026-03-09 16:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:42.046466252 +0000 UTC m=+2029.180148685" watchObservedRunningTime="2026-03-09 16:31:42.059190293 +0000 UTC m=+2029.192872716" Mar 09 16:31:48 crc kubenswrapper[4831]: I0309 16:31:48.058275 4831 generic.go:334] "Generic (PLEG): container finished" podID="571ac593-4c25-4a4a-9d47-7c37dfce2b54" containerID="652886be26b14f2029241f544dec471acf6e2627541a8cc9ee6ae1821b3662b8" exitCode=0 Mar 09 16:31:48 crc kubenswrapper[4831]: I0309 16:31:48.058360 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" event={"ID":"571ac593-4c25-4a4a-9d47-7c37dfce2b54","Type":"ContainerDied","Data":"652886be26b14f2029241f544dec471acf6e2627541a8cc9ee6ae1821b3662b8"} Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.334648 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.367330 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7k785"] Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.374883 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7k785"] Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.451801 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ac593-4c25-4a4a-9d47-7c37dfce2b54-etc-swift\") pod \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.451914 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-swiftconf\") pod \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.451972 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sxg5\" (UniqueName: \"kubernetes.io/projected/571ac593-4c25-4a4a-9d47-7c37dfce2b54-kube-api-access-8sxg5\") pod \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.452024 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-ring-data-devices\") pod \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.452059 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-scripts\") pod \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.452089 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-dispersionconf\") pod \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\" (UID: \"571ac593-4c25-4a4a-9d47-7c37dfce2b54\") " Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.464696 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "571ac593-4c25-4a4a-9d47-7c37dfce2b54" (UID: "571ac593-4c25-4a4a-9d47-7c37dfce2b54"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.464944 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571ac593-4c25-4a4a-9d47-7c37dfce2b54-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "571ac593-4c25-4a4a-9d47-7c37dfce2b54" (UID: "571ac593-4c25-4a4a-9d47-7c37dfce2b54"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.467724 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571ac593-4c25-4a4a-9d47-7c37dfce2b54-kube-api-access-8sxg5" (OuterVolumeSpecName: "kube-api-access-8sxg5") pod "571ac593-4c25-4a4a-9d47-7c37dfce2b54" (UID: "571ac593-4c25-4a4a-9d47-7c37dfce2b54"). InnerVolumeSpecName "kube-api-access-8sxg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.477159 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "571ac593-4c25-4a4a-9d47-7c37dfce2b54" (UID: "571ac593-4c25-4a4a-9d47-7c37dfce2b54"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.483764 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "571ac593-4c25-4a4a-9d47-7c37dfce2b54" (UID: "571ac593-4c25-4a4a-9d47-7c37dfce2b54"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.485459 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-scripts" (OuterVolumeSpecName: "scripts") pod "571ac593-4c25-4a4a-9d47-7c37dfce2b54" (UID: "571ac593-4c25-4a4a-9d47-7c37dfce2b54"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.554528 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ac593-4c25-4a4a-9d47-7c37dfce2b54-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.554565 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.554574 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sxg5\" (UniqueName: \"kubernetes.io/projected/571ac593-4c25-4a4a-9d47-7c37dfce2b54-kube-api-access-8sxg5\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.554584 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.554593 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ac593-4c25-4a4a-9d47-7c37dfce2b54-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.554601 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ac593-4c25-4a4a-9d47-7c37dfce2b54-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.626650 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571ac593-4c25-4a4a-9d47-7c37dfce2b54" path="/var/lib/kubelet/pods/571ac593-4c25-4a4a-9d47-7c37dfce2b54/volumes" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.731175 4831 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2"] Mar 09 16:31:49 crc kubenswrapper[4831]: E0309 16:31:49.731759 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571ac593-4c25-4a4a-9d47-7c37dfce2b54" containerName="swift-ring-rebalance" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.731793 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="571ac593-4c25-4a4a-9d47-7c37dfce2b54" containerName="swift-ring-rebalance" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.732038 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="571ac593-4c25-4a4a-9d47-7c37dfce2b54" containerName="swift-ring-rebalance" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.732870 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.736754 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2"] Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.860068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jrl\" (UniqueName: \"kubernetes.io/projected/29d1d1a7-73fe-418a-83e8-eb52b4001062-kube-api-access-d8jrl\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.860397 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-swiftconf\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.860453 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-dispersionconf\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.860478 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d1d1a7-73fe-418a-83e8-eb52b4001062-etc-swift\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.860625 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-ring-data-devices\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.860765 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-scripts\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.961970 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-dispersionconf\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.962034 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d1d1a7-73fe-418a-83e8-eb52b4001062-etc-swift\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.962072 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-ring-data-devices\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.962154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-scripts\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.962247 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jrl\" (UniqueName: \"kubernetes.io/projected/29d1d1a7-73fe-418a-83e8-eb52b4001062-kube-api-access-d8jrl\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.962283 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-swiftconf\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: 
\"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.962845 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d1d1a7-73fe-418a-83e8-eb52b4001062-etc-swift\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.963686 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-ring-data-devices\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.964343 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-scripts\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.969582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-swiftconf\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.969889 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-dispersionconf\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:49 crc kubenswrapper[4831]: I0309 16:31:49.986283 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jrl\" (UniqueName: \"kubernetes.io/projected/29d1d1a7-73fe-418a-83e8-eb52b4001062-kube-api-access-d8jrl\") pod \"swift-ring-rebalance-debug-4c2k2\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:50 crc kubenswrapper[4831]: I0309 16:31:50.074625 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:50 crc kubenswrapper[4831]: I0309 16:31:50.077583 4831 scope.go:117] "RemoveContainer" containerID="652886be26b14f2029241f544dec471acf6e2627541a8cc9ee6ae1821b3662b8" Mar 09 16:31:50 crc kubenswrapper[4831]: I0309 16:31:50.077627 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7k785" Mar 09 16:31:50 crc kubenswrapper[4831]: I0309 16:31:50.567917 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2"] Mar 09 16:31:51 crc kubenswrapper[4831]: I0309 16:31:51.089473 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" event={"ID":"29d1d1a7-73fe-418a-83e8-eb52b4001062","Type":"ContainerStarted","Data":"28f3b1761ef562d29a04dc6a98ae5e100b6fc6e04573123133e748cbb15c1f8f"} Mar 09 16:31:51 crc kubenswrapper[4831]: I0309 16:31:51.089524 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" event={"ID":"29d1d1a7-73fe-418a-83e8-eb52b4001062","Type":"ContainerStarted","Data":"304c6572b427b026e0f649714d664b036e50e7965a492a071567698a6b9fb689"} Mar 09 16:31:51 crc kubenswrapper[4831]: I0309 16:31:51.110464 4831 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" podStartSLOduration=2.110442538 podStartE2EDuration="2.110442538s" podCreationTimestamp="2026-03-09 16:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:51.104227412 +0000 UTC m=+2038.237909845" watchObservedRunningTime="2026-03-09 16:31:51.110442538 +0000 UTC m=+2038.244124971" Mar 09 16:31:52 crc kubenswrapper[4831]: I0309 16:31:52.095801 4831 generic.go:334] "Generic (PLEG): container finished" podID="29d1d1a7-73fe-418a-83e8-eb52b4001062" containerID="28f3b1761ef562d29a04dc6a98ae5e100b6fc6e04573123133e748cbb15c1f8f" exitCode=0 Mar 09 16:31:52 crc kubenswrapper[4831]: I0309 16:31:52.095884 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" event={"ID":"29d1d1a7-73fe-418a-83e8-eb52b4001062","Type":"ContainerDied","Data":"28f3b1761ef562d29a04dc6a98ae5e100b6fc6e04573123133e748cbb15c1f8f"} Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.363679 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.396342 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2"] Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.403899 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2"] Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.514246 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-ring-data-devices\") pod \"29d1d1a7-73fe-418a-83e8-eb52b4001062\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.514316 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d1d1a7-73fe-418a-83e8-eb52b4001062-etc-swift\") pod \"29d1d1a7-73fe-418a-83e8-eb52b4001062\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.514344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-swiftconf\") pod \"29d1d1a7-73fe-418a-83e8-eb52b4001062\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.514426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-dispersionconf\") pod \"29d1d1a7-73fe-418a-83e8-eb52b4001062\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.514469 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-scripts\") pod \"29d1d1a7-73fe-418a-83e8-eb52b4001062\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.514489 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8jrl\" (UniqueName: \"kubernetes.io/projected/29d1d1a7-73fe-418a-83e8-eb52b4001062-kube-api-access-d8jrl\") pod \"29d1d1a7-73fe-418a-83e8-eb52b4001062\" (UID: \"29d1d1a7-73fe-418a-83e8-eb52b4001062\") " Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.515044 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d1d1a7-73fe-418a-83e8-eb52b4001062-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "29d1d1a7-73fe-418a-83e8-eb52b4001062" (UID: "29d1d1a7-73fe-418a-83e8-eb52b4001062"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.515132 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "29d1d1a7-73fe-418a-83e8-eb52b4001062" (UID: "29d1d1a7-73fe-418a-83e8-eb52b4001062"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.524128 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d1d1a7-73fe-418a-83e8-eb52b4001062-kube-api-access-d8jrl" (OuterVolumeSpecName: "kube-api-access-d8jrl") pod "29d1d1a7-73fe-418a-83e8-eb52b4001062" (UID: "29d1d1a7-73fe-418a-83e8-eb52b4001062"). InnerVolumeSpecName "kube-api-access-d8jrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.539196 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-scripts" (OuterVolumeSpecName: "scripts") pod "29d1d1a7-73fe-418a-83e8-eb52b4001062" (UID: "29d1d1a7-73fe-418a-83e8-eb52b4001062"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.542686 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "29d1d1a7-73fe-418a-83e8-eb52b4001062" (UID: "29d1d1a7-73fe-418a-83e8-eb52b4001062"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.543474 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "29d1d1a7-73fe-418a-83e8-eb52b4001062" (UID: "29d1d1a7-73fe-418a-83e8-eb52b4001062"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.616283 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.616318 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d1d1a7-73fe-418a-83e8-eb52b4001062-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.616329 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.616342 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8jrl\" (UniqueName: \"kubernetes.io/projected/29d1d1a7-73fe-418a-83e8-eb52b4001062-kube-api-access-d8jrl\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.616352 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d1d1a7-73fe-418a-83e8-eb52b4001062-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.616361 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d1d1a7-73fe-418a-83e8-eb52b4001062-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:53 crc kubenswrapper[4831]: I0309 16:31:53.633969 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d1d1a7-73fe-418a-83e8-eb52b4001062" path="/var/lib/kubelet/pods/29d1d1a7-73fe-418a-83e8-eb52b4001062/volumes" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.115286 4831 scope.go:117] "RemoveContainer" 
containerID="28f3b1761ef562d29a04dc6a98ae5e100b6fc6e04573123133e748cbb15c1f8f" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.115358 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4c2k2" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.554893 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp"] Mar 09 16:31:54 crc kubenswrapper[4831]: E0309 16:31:54.555244 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d1d1a7-73fe-418a-83e8-eb52b4001062" containerName="swift-ring-rebalance" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.555260 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d1d1a7-73fe-418a-83e8-eb52b4001062" containerName="swift-ring-rebalance" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.555451 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d1d1a7-73fe-418a-83e8-eb52b4001062" containerName="swift-ring-rebalance" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.555998 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.561356 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.561355 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.567724 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp"] Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.732781 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-ring-data-devices\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.733104 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hsl8\" (UniqueName: \"kubernetes.io/projected/c11a9de1-1340-4854-9972-a0baa8437206-kube-api-access-7hsl8\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.733274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11a9de1-1340-4854-9972-a0baa8437206-etc-swift\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.733427 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-scripts\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.733520 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-dispersionconf\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.733837 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-swiftconf\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.835271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11a9de1-1340-4854-9972-a0baa8437206-etc-swift\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.835363 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-scripts\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.835386 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-dispersionconf\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.835542 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-swiftconf\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.835603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-ring-data-devices\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.835653 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hsl8\" (UniqueName: \"kubernetes.io/projected/c11a9de1-1340-4854-9972-a0baa8437206-kube-api-access-7hsl8\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.835992 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11a9de1-1340-4854-9972-a0baa8437206-etc-swift\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 
16:31:54.836720 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-ring-data-devices\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.836875 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-scripts\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.839921 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-swiftconf\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.840681 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-dispersionconf\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.853129 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hsl8\" (UniqueName: \"kubernetes.io/projected/c11a9de1-1340-4854-9972-a0baa8437206-kube-api-access-7hsl8\") pod \"swift-ring-rebalance-debug-g2ctp\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:54 crc kubenswrapper[4831]: I0309 16:31:54.873746 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:55 crc kubenswrapper[4831]: I0309 16:31:55.274250 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp"] Mar 09 16:31:55 crc kubenswrapper[4831]: W0309 16:31:55.283836 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc11a9de1_1340_4854_9972_a0baa8437206.slice/crio-547c04ca1bbe0829d1a17f31203c9278164a4a04dfb20e723d0a76500e66d033 WatchSource:0}: Error finding container 547c04ca1bbe0829d1a17f31203c9278164a4a04dfb20e723d0a76500e66d033: Status 404 returned error can't find the container with id 547c04ca1bbe0829d1a17f31203c9278164a4a04dfb20e723d0a76500e66d033 Mar 09 16:31:56 crc kubenswrapper[4831]: I0309 16:31:56.142924 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" event={"ID":"c11a9de1-1340-4854-9972-a0baa8437206","Type":"ContainerStarted","Data":"00e1ece29cdda227161271a4e4c5992cc396e33ea6aeda03b68ad6615faf2941"} Mar 09 16:31:56 crc kubenswrapper[4831]: I0309 16:31:56.143190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" event={"ID":"c11a9de1-1340-4854-9972-a0baa8437206","Type":"ContainerStarted","Data":"547c04ca1bbe0829d1a17f31203c9278164a4a04dfb20e723d0a76500e66d033"} Mar 09 16:31:56 crc kubenswrapper[4831]: I0309 16:31:56.170063 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" podStartSLOduration=2.170045156 podStartE2EDuration="2.170045156s" podCreationTimestamp="2026-03-09 16:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:31:56.167778592 +0000 UTC m=+2043.301461015" 
watchObservedRunningTime="2026-03-09 16:31:56.170045156 +0000 UTC m=+2043.303727579" Mar 09 16:31:57 crc kubenswrapper[4831]: I0309 16:31:57.158078 4831 generic.go:334] "Generic (PLEG): container finished" podID="c11a9de1-1340-4854-9972-a0baa8437206" containerID="00e1ece29cdda227161271a4e4c5992cc396e33ea6aeda03b68ad6615faf2941" exitCode=0 Mar 09 16:31:57 crc kubenswrapper[4831]: I0309 16:31:57.158205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" event={"ID":"c11a9de1-1340-4854-9972-a0baa8437206","Type":"ContainerDied","Data":"00e1ece29cdda227161271a4e4c5992cc396e33ea6aeda03b68ad6615faf2941"} Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.442012 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.474415 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp"] Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.481557 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp"] Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.585992 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-swiftconf\") pod \"c11a9de1-1340-4854-9972-a0baa8437206\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.586129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-scripts\") pod \"c11a9de1-1340-4854-9972-a0baa8437206\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.586177 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-ring-data-devices\") pod \"c11a9de1-1340-4854-9972-a0baa8437206\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.586222 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11a9de1-1340-4854-9972-a0baa8437206-etc-swift\") pod \"c11a9de1-1340-4854-9972-a0baa8437206\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.586240 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-dispersionconf\") pod \"c11a9de1-1340-4854-9972-a0baa8437206\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.586292 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hsl8\" (UniqueName: \"kubernetes.io/projected/c11a9de1-1340-4854-9972-a0baa8437206-kube-api-access-7hsl8\") pod \"c11a9de1-1340-4854-9972-a0baa8437206\" (UID: \"c11a9de1-1340-4854-9972-a0baa8437206\") " Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.587290 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11a9de1-1340-4854-9972-a0baa8437206-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c11a9de1-1340-4854-9972-a0baa8437206" (UID: "c11a9de1-1340-4854-9972-a0baa8437206"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.587822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c11a9de1-1340-4854-9972-a0baa8437206" (UID: "c11a9de1-1340-4854-9972-a0baa8437206"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.599634 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11a9de1-1340-4854-9972-a0baa8437206-kube-api-access-7hsl8" (OuterVolumeSpecName: "kube-api-access-7hsl8") pod "c11a9de1-1340-4854-9972-a0baa8437206" (UID: "c11a9de1-1340-4854-9972-a0baa8437206"). InnerVolumeSpecName "kube-api-access-7hsl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.611803 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-scripts" (OuterVolumeSpecName: "scripts") pod "c11a9de1-1340-4854-9972-a0baa8437206" (UID: "c11a9de1-1340-4854-9972-a0baa8437206"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.612128 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c11a9de1-1340-4854-9972-a0baa8437206" (UID: "c11a9de1-1340-4854-9972-a0baa8437206"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.621239 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c11a9de1-1340-4854-9972-a0baa8437206" (UID: "c11a9de1-1340-4854-9972-a0baa8437206"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.688655 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.688899 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11a9de1-1340-4854-9972-a0baa8437206-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.688962 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11a9de1-1340-4854-9972-a0baa8437206-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.689051 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.689137 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hsl8\" (UniqueName: \"kubernetes.io/projected/c11a9de1-1340-4854-9972-a0baa8437206-kube-api-access-7hsl8\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:58 crc kubenswrapper[4831]: I0309 16:31:58.689211 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/c11a9de1-1340-4854-9972-a0baa8437206-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.191473 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="547c04ca1bbe0829d1a17f31203c9278164a4a04dfb20e723d0a76500e66d033" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.191594 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-g2ctp" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.627220 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11a9de1-1340-4854-9972-a0baa8437206" path="/var/lib/kubelet/pods/c11a9de1-1340-4854-9972-a0baa8437206/volumes" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.628032 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2"] Mar 09 16:31:59 crc kubenswrapper[4831]: E0309 16:31:59.628243 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11a9de1-1340-4854-9972-a0baa8437206" containerName="swift-ring-rebalance" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.628254 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11a9de1-1340-4854-9972-a0baa8437206" containerName="swift-ring-rebalance" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.628432 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11a9de1-1340-4854-9972-a0baa8437206" containerName="swift-ring-rebalance" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.628862 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.630373 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.632852 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.649208 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2"] Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.809849 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flcxk\" (UniqueName: \"kubernetes.io/projected/dbdd7a8c-6647-4c8d-b888-94f715861cb2-kube-api-access-flcxk\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.809901 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-scripts\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.809947 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-ring-data-devices\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.810100 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-swiftconf\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.810145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd7a8c-6647-4c8d-b888-94f715861cb2-etc-swift\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.810176 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-dispersionconf\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.911873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-swiftconf\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.911919 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd7a8c-6647-4c8d-b888-94f715861cb2-etc-swift\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 
16:31:59.911941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-dispersionconf\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.912011 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flcxk\" (UniqueName: \"kubernetes.io/projected/dbdd7a8c-6647-4c8d-b888-94f715861cb2-kube-api-access-flcxk\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.912056 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-scripts\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.912112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-ring-data-devices\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.913069 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-ring-data-devices\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc 
kubenswrapper[4831]: I0309 16:31:59.913315 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd7a8c-6647-4c8d-b888-94f715861cb2-etc-swift\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.913750 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-scripts\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.916033 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-dispersionconf\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.916345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-swiftconf\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 16:31:59.934342 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flcxk\" (UniqueName: \"kubernetes.io/projected/dbdd7a8c-6647-4c8d-b888-94f715861cb2-kube-api-access-flcxk\") pod \"swift-ring-rebalance-debug-8k4d2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:31:59 crc kubenswrapper[4831]: I0309 
16:31:59.945566 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.130581 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551232-d4cc9"] Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.131828 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.138886 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.139257 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.139355 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.142665 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551232-d4cc9"] Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.218371 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmb6v\" (UniqueName: \"kubernetes.io/projected/cd99a0c8-1885-451a-adfd-9b493fbf5e7f-kube-api-access-tmb6v\") pod \"auto-csr-approver-29551232-d4cc9\" (UID: \"cd99a0c8-1885-451a-adfd-9b493fbf5e7f\") " pod="openshift-infra/auto-csr-approver-29551232-d4cc9" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.320308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmb6v\" (UniqueName: \"kubernetes.io/projected/cd99a0c8-1885-451a-adfd-9b493fbf5e7f-kube-api-access-tmb6v\") pod \"auto-csr-approver-29551232-d4cc9\" (UID: 
\"cd99a0c8-1885-451a-adfd-9b493fbf5e7f\") " pod="openshift-infra/auto-csr-approver-29551232-d4cc9" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.339530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmb6v\" (UniqueName: \"kubernetes.io/projected/cd99a0c8-1885-451a-adfd-9b493fbf5e7f-kube-api-access-tmb6v\") pod \"auto-csr-approver-29551232-d4cc9\" (UID: \"cd99a0c8-1885-451a-adfd-9b493fbf5e7f\") " pod="openshift-infra/auto-csr-approver-29551232-d4cc9" Mar 09 16:32:00 crc kubenswrapper[4831]: W0309 16:32:00.372930 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbdd7a8c_6647_4c8d_b888_94f715861cb2.slice/crio-e37642a3a41a01ee3a6f5e37ac636f3b98ccb3905926343276cb31e989e36133 WatchSource:0}: Error finding container e37642a3a41a01ee3a6f5e37ac636f3b98ccb3905926343276cb31e989e36133: Status 404 returned error can't find the container with id e37642a3a41a01ee3a6f5e37ac636f3b98ccb3905926343276cb31e989e36133 Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.374984 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2"] Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.459746 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" Mar 09 16:32:00 crc kubenswrapper[4831]: I0309 16:32:00.916638 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551232-d4cc9"] Mar 09 16:32:01 crc kubenswrapper[4831]: I0309 16:32:01.206771 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" event={"ID":"cd99a0c8-1885-451a-adfd-9b493fbf5e7f","Type":"ContainerStarted","Data":"c01ed81e0fd828d00dc098f908dd10d60ab708feec024a7f43af7e318ed67a13"} Mar 09 16:32:01 crc kubenswrapper[4831]: I0309 16:32:01.207987 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" event={"ID":"dbdd7a8c-6647-4c8d-b888-94f715861cb2","Type":"ContainerStarted","Data":"2b24f8818ced0470e5f5e0858d95cb2e52ac6309538701f17fd5b45fa041ebb4"} Mar 09 16:32:01 crc kubenswrapper[4831]: I0309 16:32:01.208030 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" event={"ID":"dbdd7a8c-6647-4c8d-b888-94f715861cb2","Type":"ContainerStarted","Data":"e37642a3a41a01ee3a6f5e37ac636f3b98ccb3905926343276cb31e989e36133"} Mar 09 16:32:01 crc kubenswrapper[4831]: I0309 16:32:01.231134 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" podStartSLOduration=2.231111314 podStartE2EDuration="2.231111314s" podCreationTimestamp="2026-03-09 16:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:01.223649522 +0000 UTC m=+2048.357331945" watchObservedRunningTime="2026-03-09 16:32:01.231111314 +0000 UTC m=+2048.364793737" Mar 09 16:32:02 crc kubenswrapper[4831]: I0309 16:32:02.216296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" 
event={"ID":"cd99a0c8-1885-451a-adfd-9b493fbf5e7f","Type":"ContainerStarted","Data":"986737dea23a18e273b6264307453a8fb79870e7569b0330771533f5a3383a4a"} Mar 09 16:32:02 crc kubenswrapper[4831]: I0309 16:32:02.217842 4831 generic.go:334] "Generic (PLEG): container finished" podID="dbdd7a8c-6647-4c8d-b888-94f715861cb2" containerID="2b24f8818ced0470e5f5e0858d95cb2e52ac6309538701f17fd5b45fa041ebb4" exitCode=0 Mar 09 16:32:02 crc kubenswrapper[4831]: I0309 16:32:02.217894 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" event={"ID":"dbdd7a8c-6647-4c8d-b888-94f715861cb2","Type":"ContainerDied","Data":"2b24f8818ced0470e5f5e0858d95cb2e52ac6309538701f17fd5b45fa041ebb4"} Mar 09 16:32:02 crc kubenswrapper[4831]: I0309 16:32:02.232781 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" podStartSLOduration=1.264740908 podStartE2EDuration="2.232758192s" podCreationTimestamp="2026-03-09 16:32:00 +0000 UTC" firstStartedPulling="2026-03-09 16:32:00.932434944 +0000 UTC m=+2048.066117407" lastFinishedPulling="2026-03-09 16:32:01.900452268 +0000 UTC m=+2049.034134691" observedRunningTime="2026-03-09 16:32:02.232635999 +0000 UTC m=+2049.366318442" watchObservedRunningTime="2026-03-09 16:32:02.232758192 +0000 UTC m=+2049.366440625" Mar 09 16:32:02 crc kubenswrapper[4831]: I0309 16:32:02.427064 4831 scope.go:117] "RemoveContainer" containerID="41d421601f49d2f213d5cefdb7ce9ad6982cc9eb06b4103d3fc4f2aac1be507a" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.231587 4831 generic.go:334] "Generic (PLEG): container finished" podID="cd99a0c8-1885-451a-adfd-9b493fbf5e7f" containerID="986737dea23a18e273b6264307453a8fb79870e7569b0330771533f5a3383a4a" exitCode=0 Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.231731 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" 
event={"ID":"cd99a0c8-1885-451a-adfd-9b493fbf5e7f","Type":"ContainerDied","Data":"986737dea23a18e273b6264307453a8fb79870e7569b0330771533f5a3383a4a"} Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.527913 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.557540 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2"] Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.564253 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2"] Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.574818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-dispersionconf\") pod \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.575233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-ring-data-devices\") pod \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.575521 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flcxk\" (UniqueName: \"kubernetes.io/projected/dbdd7a8c-6647-4c8d-b888-94f715861cb2-kube-api-access-flcxk\") pod \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.575658 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/dbdd7a8c-6647-4c8d-b888-94f715861cb2-etc-swift\") pod \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.576018 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-swiftconf\") pod \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.576154 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-scripts\") pod \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\" (UID: \"dbdd7a8c-6647-4c8d-b888-94f715861cb2\") " Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.575810 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dbdd7a8c-6647-4c8d-b888-94f715861cb2" (UID: "dbdd7a8c-6647-4c8d-b888-94f715861cb2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.576372 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdd7a8c-6647-4c8d-b888-94f715861cb2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dbdd7a8c-6647-4c8d-b888-94f715861cb2" (UID: "dbdd7a8c-6647-4c8d-b888-94f715861cb2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.576843 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.576935 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbdd7a8c-6647-4c8d-b888-94f715861cb2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.582274 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdd7a8c-6647-4c8d-b888-94f715861cb2-kube-api-access-flcxk" (OuterVolumeSpecName: "kube-api-access-flcxk") pod "dbdd7a8c-6647-4c8d-b888-94f715861cb2" (UID: "dbdd7a8c-6647-4c8d-b888-94f715861cb2"). InnerVolumeSpecName "kube-api-access-flcxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.598228 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dbdd7a8c-6647-4c8d-b888-94f715861cb2" (UID: "dbdd7a8c-6647-4c8d-b888-94f715861cb2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.598744 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-scripts" (OuterVolumeSpecName: "scripts") pod "dbdd7a8c-6647-4c8d-b888-94f715861cb2" (UID: "dbdd7a8c-6647-4c8d-b888-94f715861cb2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.604736 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dbdd7a8c-6647-4c8d-b888-94f715861cb2" (UID: "dbdd7a8c-6647-4c8d-b888-94f715861cb2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.634538 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbdd7a8c-6647-4c8d-b888-94f715861cb2" path="/var/lib/kubelet/pods/dbdd7a8c-6647-4c8d-b888-94f715861cb2/volumes" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.678956 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.679259 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flcxk\" (UniqueName: \"kubernetes.io/projected/dbdd7a8c-6647-4c8d-b888-94f715861cb2-kube-api-access-flcxk\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.679343 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbdd7a8c-6647-4c8d-b888-94f715861cb2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:03 crc kubenswrapper[4831]: I0309 16:32:03.679519 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbdd7a8c-6647-4c8d-b888-94f715861cb2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.240997 4831 scope.go:117] "RemoveContainer" containerID="2b24f8818ced0470e5f5e0858d95cb2e52ac6309538701f17fd5b45fa041ebb4" Mar 09 16:32:04 crc 
kubenswrapper[4831]: I0309 16:32:04.241060 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8k4d2" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.543475 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.593294 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmb6v\" (UniqueName: \"kubernetes.io/projected/cd99a0c8-1885-451a-adfd-9b493fbf5e7f-kube-api-access-tmb6v\") pod \"cd99a0c8-1885-451a-adfd-9b493fbf5e7f\" (UID: \"cd99a0c8-1885-451a-adfd-9b493fbf5e7f\") " Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.598830 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd99a0c8-1885-451a-adfd-9b493fbf5e7f-kube-api-access-tmb6v" (OuterVolumeSpecName: "kube-api-access-tmb6v") pod "cd99a0c8-1885-451a-adfd-9b493fbf5e7f" (UID: "cd99a0c8-1885-451a-adfd-9b493fbf5e7f"). InnerVolumeSpecName "kube-api-access-tmb6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.694926 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmb6v\" (UniqueName: \"kubernetes.io/projected/cd99a0c8-1885-451a-adfd-9b493fbf5e7f-kube-api-access-tmb6v\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.695860 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cr67d"] Mar 09 16:32:04 crc kubenswrapper[4831]: E0309 16:32:04.696210 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdd7a8c-6647-4c8d-b888-94f715861cb2" containerName="swift-ring-rebalance" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.696227 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdd7a8c-6647-4c8d-b888-94f715861cb2" containerName="swift-ring-rebalance" Mar 09 16:32:04 crc kubenswrapper[4831]: E0309 16:32:04.696248 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd99a0c8-1885-451a-adfd-9b493fbf5e7f" containerName="oc" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.696256 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd99a0c8-1885-451a-adfd-9b493fbf5e7f" containerName="oc" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.696456 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdd7a8c-6647-4c8d-b888-94f715861cb2" containerName="swift-ring-rebalance" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.696490 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd99a0c8-1885-451a-adfd-9b493fbf5e7f" containerName="oc" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.697068 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.699700 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.701195 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.707340 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cr67d"] Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.796000 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-swiftconf\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.796052 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-dispersionconf\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.796074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c6hd\" (UniqueName: \"kubernetes.io/projected/f46b84df-b722-4db8-90b5-aedc12fb0a6c-kube-api-access-8c6hd\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.796161 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-scripts\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.796205 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f46b84df-b722-4db8-90b5-aedc12fb0a6c-etc-swift\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.796286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-ring-data-devices\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.898237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c6hd\" (UniqueName: \"kubernetes.io/projected/f46b84df-b722-4db8-90b5-aedc12fb0a6c-kube-api-access-8c6hd\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.898820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-scripts\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc 
kubenswrapper[4831]: I0309 16:32:04.898896 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f46b84df-b722-4db8-90b5-aedc12fb0a6c-etc-swift\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.898989 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-ring-data-devices\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.899017 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-swiftconf\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.899241 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-dispersionconf\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.899924 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f46b84df-b722-4db8-90b5-aedc12fb0a6c-etc-swift\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc 
kubenswrapper[4831]: I0309 16:32:04.900560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-ring-data-devices\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.900777 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-scripts\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.905206 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-swiftconf\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.905382 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-dispersionconf\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:04 crc kubenswrapper[4831]: I0309 16:32:04.938441 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c6hd\" (UniqueName: \"kubernetes.io/projected/f46b84df-b722-4db8-90b5-aedc12fb0a6c-kube-api-access-8c6hd\") pod \"swift-ring-rebalance-debug-cr67d\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:05 crc kubenswrapper[4831]: 
I0309 16:32:05.022217 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:05 crc kubenswrapper[4831]: I0309 16:32:05.250440 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" event={"ID":"cd99a0c8-1885-451a-adfd-9b493fbf5e7f","Type":"ContainerDied","Data":"c01ed81e0fd828d00dc098f908dd10d60ab708feec024a7f43af7e318ed67a13"} Mar 09 16:32:05 crc kubenswrapper[4831]: I0309 16:32:05.250488 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c01ed81e0fd828d00dc098f908dd10d60ab708feec024a7f43af7e318ed67a13" Mar 09 16:32:05 crc kubenswrapper[4831]: I0309 16:32:05.250509 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551232-d4cc9" Mar 09 16:32:05 crc kubenswrapper[4831]: I0309 16:32:05.285577 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551226-r5g4z"] Mar 09 16:32:05 crc kubenswrapper[4831]: I0309 16:32:05.292156 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551226-r5g4z"] Mar 09 16:32:05 crc kubenswrapper[4831]: I0309 16:32:05.416509 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cr67d"] Mar 09 16:32:05 crc kubenswrapper[4831]: W0309 16:32:05.426116 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46b84df_b722_4db8_90b5_aedc12fb0a6c.slice/crio-1f64effee71a0d9a262e43afe723942ebf179f7b4bd99fa9a0c47a029a31ed4e WatchSource:0}: Error finding container 1f64effee71a0d9a262e43afe723942ebf179f7b4bd99fa9a0c47a029a31ed4e: Status 404 returned error can't find the container with id 1f64effee71a0d9a262e43afe723942ebf179f7b4bd99fa9a0c47a029a31ed4e Mar 09 16:32:05 crc 
kubenswrapper[4831]: I0309 16:32:05.628713 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718df0bb-6cad-4ce2-821d-f67fc014f745" path="/var/lib/kubelet/pods/718df0bb-6cad-4ce2-821d-f67fc014f745/volumes" Mar 09 16:32:06 crc kubenswrapper[4831]: I0309 16:32:06.259604 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" event={"ID":"f46b84df-b722-4db8-90b5-aedc12fb0a6c","Type":"ContainerStarted","Data":"a143a0caffcacb833733fd68c8aced486fe9ee7fd9f3f5b408341967a8b87283"} Mar 09 16:32:06 crc kubenswrapper[4831]: I0309 16:32:06.259907 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" event={"ID":"f46b84df-b722-4db8-90b5-aedc12fb0a6c","Type":"ContainerStarted","Data":"1f64effee71a0d9a262e43afe723942ebf179f7b4bd99fa9a0c47a029a31ed4e"} Mar 09 16:32:07 crc kubenswrapper[4831]: I0309 16:32:07.269849 4831 generic.go:334] "Generic (PLEG): container finished" podID="f46b84df-b722-4db8-90b5-aedc12fb0a6c" containerID="a143a0caffcacb833733fd68c8aced486fe9ee7fd9f3f5b408341967a8b87283" exitCode=0 Mar 09 16:32:07 crc kubenswrapper[4831]: I0309 16:32:07.269904 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" event={"ID":"f46b84df-b722-4db8-90b5-aedc12fb0a6c","Type":"ContainerDied","Data":"a143a0caffcacb833733fd68c8aced486fe9ee7fd9f3f5b408341967a8b87283"} Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.605198 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.635954 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cr67d"] Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.643779 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cr67d"] Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.661308 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-swiftconf\") pod \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.661460 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c6hd\" (UniqueName: \"kubernetes.io/projected/f46b84df-b722-4db8-90b5-aedc12fb0a6c-kube-api-access-8c6hd\") pod \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.661495 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-ring-data-devices\") pod \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.661551 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f46b84df-b722-4db8-90b5-aedc12fb0a6c-etc-swift\") pod \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.661613 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-dispersionconf\") pod \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.661660 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-scripts\") pod \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\" (UID: \"f46b84df-b722-4db8-90b5-aedc12fb0a6c\") " Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.663610 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f46b84df-b722-4db8-90b5-aedc12fb0a6c" (UID: "f46b84df-b722-4db8-90b5-aedc12fb0a6c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.663774 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46b84df-b722-4db8-90b5-aedc12fb0a6c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f46b84df-b722-4db8-90b5-aedc12fb0a6c" (UID: "f46b84df-b722-4db8-90b5-aedc12fb0a6c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.667545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46b84df-b722-4db8-90b5-aedc12fb0a6c-kube-api-access-8c6hd" (OuterVolumeSpecName: "kube-api-access-8c6hd") pod "f46b84df-b722-4db8-90b5-aedc12fb0a6c" (UID: "f46b84df-b722-4db8-90b5-aedc12fb0a6c"). InnerVolumeSpecName "kube-api-access-8c6hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.680069 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-scripts" (OuterVolumeSpecName: "scripts") pod "f46b84df-b722-4db8-90b5-aedc12fb0a6c" (UID: "f46b84df-b722-4db8-90b5-aedc12fb0a6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.684682 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f46b84df-b722-4db8-90b5-aedc12fb0a6c" (UID: "f46b84df-b722-4db8-90b5-aedc12fb0a6c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.688373 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f46b84df-b722-4db8-90b5-aedc12fb0a6c" (UID: "f46b84df-b722-4db8-90b5-aedc12fb0a6c"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.763558 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c6hd\" (UniqueName: \"kubernetes.io/projected/f46b84df-b722-4db8-90b5-aedc12fb0a6c-kube-api-access-8c6hd\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.763600 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.763614 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f46b84df-b722-4db8-90b5-aedc12fb0a6c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.763625 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.763637 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f46b84df-b722-4db8-90b5-aedc12fb0a6c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:08 crc kubenswrapper[4831]: I0309 16:32:08.763649 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f46b84df-b722-4db8-90b5-aedc12fb0a6c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.290601 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f64effee71a0d9a262e43afe723942ebf179f7b4bd99fa9a0c47a029a31ed4e" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.290667 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cr67d" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.632196 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46b84df-b722-4db8-90b5-aedc12fb0a6c" path="/var/lib/kubelet/pods/f46b84df-b722-4db8-90b5-aedc12fb0a6c/volumes" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.827781 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc"] Mar 09 16:32:09 crc kubenswrapper[4831]: E0309 16:32:09.828135 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46b84df-b722-4db8-90b5-aedc12fb0a6c" containerName="swift-ring-rebalance" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.828157 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46b84df-b722-4db8-90b5-aedc12fb0a6c" containerName="swift-ring-rebalance" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.828367 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46b84df-b722-4db8-90b5-aedc12fb0a6c" containerName="swift-ring-rebalance" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.829014 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.832160 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.833443 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.835672 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc"] Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.881080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-swiftconf\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.881140 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-scripts\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.881203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-ring-data-devices\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.881254 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac41b9a3-d45a-4e34-9339-2943091b3886-etc-swift\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.881278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-dispersionconf\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.881359 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdh9\" (UniqueName: \"kubernetes.io/projected/ac41b9a3-d45a-4e34-9339-2943091b3886-kube-api-access-vgdh9\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.983493 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-scripts\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.983599 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-ring-data-devices\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 
crc kubenswrapper[4831]: I0309 16:32:09.983680 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac41b9a3-d45a-4e34-9339-2943091b3886-etc-swift\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.983763 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-dispersionconf\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.983834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdh9\" (UniqueName: \"kubernetes.io/projected/ac41b9a3-d45a-4e34-9339-2943091b3886-kube-api-access-vgdh9\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.983961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-swiftconf\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.985253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac41b9a3-d45a-4e34-9339-2943091b3886-etc-swift\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc 
kubenswrapper[4831]: I0309 16:32:09.985317 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-ring-data-devices\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.986666 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-scripts\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.988798 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-swiftconf\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:09 crc kubenswrapper[4831]: I0309 16:32:09.988875 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-dispersionconf\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:10 crc kubenswrapper[4831]: I0309 16:32:10.006438 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdh9\" (UniqueName: \"kubernetes.io/projected/ac41b9a3-d45a-4e34-9339-2943091b3886-kube-api-access-vgdh9\") pod \"swift-ring-rebalance-debug-sz9xc\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:10 crc kubenswrapper[4831]: 
I0309 16:32:10.147893 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:10 crc kubenswrapper[4831]: I0309 16:32:10.615646 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc"] Mar 09 16:32:11 crc kubenswrapper[4831]: I0309 16:32:11.309331 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" event={"ID":"ac41b9a3-d45a-4e34-9339-2943091b3886","Type":"ContainerStarted","Data":"2fbdc2fd7e117df0425a41149a54360e9bd8b963cf79b620954925cd693eebc0"} Mar 09 16:32:11 crc kubenswrapper[4831]: I0309 16:32:11.309673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" event={"ID":"ac41b9a3-d45a-4e34-9339-2943091b3886","Type":"ContainerStarted","Data":"43f8c515e053a22e95b227039f84280fd8d0484c80a03c3024bec52663cf8c8a"} Mar 09 16:32:11 crc kubenswrapper[4831]: I0309 16:32:11.326702 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" podStartSLOduration=2.326685308 podStartE2EDuration="2.326685308s" podCreationTimestamp="2026-03-09 16:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:11.323815037 +0000 UTC m=+2058.457497460" watchObservedRunningTime="2026-03-09 16:32:11.326685308 +0000 UTC m=+2058.460367731" Mar 09 16:32:12 crc kubenswrapper[4831]: I0309 16:32:12.320138 4831 generic.go:334] "Generic (PLEG): container finished" podID="ac41b9a3-d45a-4e34-9339-2943091b3886" containerID="2fbdc2fd7e117df0425a41149a54360e9bd8b963cf79b620954925cd693eebc0" exitCode=0 Mar 09 16:32:12 crc kubenswrapper[4831]: I0309 16:32:12.320222 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" 
event={"ID":"ac41b9a3-d45a-4e34-9339-2943091b3886","Type":"ContainerDied","Data":"2fbdc2fd7e117df0425a41149a54360e9bd8b963cf79b620954925cd693eebc0"} Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.668341 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.699902 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc"] Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.708471 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc"] Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.740714 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-ring-data-devices\") pod \"ac41b9a3-d45a-4e34-9339-2943091b3886\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.740801 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-scripts\") pod \"ac41b9a3-d45a-4e34-9339-2943091b3886\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.740849 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-dispersionconf\") pod \"ac41b9a3-d45a-4e34-9339-2943091b3886\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.740942 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-swiftconf\") 
pod \"ac41b9a3-d45a-4e34-9339-2943091b3886\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.741001 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac41b9a3-d45a-4e34-9339-2943091b3886-etc-swift\") pod \"ac41b9a3-d45a-4e34-9339-2943091b3886\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.741017 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdh9\" (UniqueName: \"kubernetes.io/projected/ac41b9a3-d45a-4e34-9339-2943091b3886-kube-api-access-vgdh9\") pod \"ac41b9a3-d45a-4e34-9339-2943091b3886\" (UID: \"ac41b9a3-d45a-4e34-9339-2943091b3886\") " Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.741983 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ac41b9a3-d45a-4e34-9339-2943091b3886" (UID: "ac41b9a3-d45a-4e34-9339-2943091b3886"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.742194 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac41b9a3-d45a-4e34-9339-2943091b3886-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ac41b9a3-d45a-4e34-9339-2943091b3886" (UID: "ac41b9a3-d45a-4e34-9339-2943091b3886"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.746326 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac41b9a3-d45a-4e34-9339-2943091b3886-kube-api-access-vgdh9" (OuterVolumeSpecName: "kube-api-access-vgdh9") pod "ac41b9a3-d45a-4e34-9339-2943091b3886" (UID: "ac41b9a3-d45a-4e34-9339-2943091b3886"). InnerVolumeSpecName "kube-api-access-vgdh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.760706 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-scripts" (OuterVolumeSpecName: "scripts") pod "ac41b9a3-d45a-4e34-9339-2943091b3886" (UID: "ac41b9a3-d45a-4e34-9339-2943091b3886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.765539 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ac41b9a3-d45a-4e34-9339-2943091b3886" (UID: "ac41b9a3-d45a-4e34-9339-2943091b3886"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.765649 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ac41b9a3-d45a-4e34-9339-2943091b3886" (UID: "ac41b9a3-d45a-4e34-9339-2943091b3886"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.843211 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac41b9a3-d45a-4e34-9339-2943091b3886-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.843280 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgdh9\" (UniqueName: \"kubernetes.io/projected/ac41b9a3-d45a-4e34-9339-2943091b3886-kube-api-access-vgdh9\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.843296 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.843309 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41b9a3-d45a-4e34-9339-2943091b3886-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.843322 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:13 crc kubenswrapper[4831]: I0309 16:32:13.843333 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac41b9a3-d45a-4e34-9339-2943091b3886-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.336928 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43f8c515e053a22e95b227039f84280fd8d0484c80a03c3024bec52663cf8c8a" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.336987 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sz9xc" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.848144 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z"] Mar 09 16:32:14 crc kubenswrapper[4831]: E0309 16:32:14.848715 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac41b9a3-d45a-4e34-9339-2943091b3886" containerName="swift-ring-rebalance" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.848747 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac41b9a3-d45a-4e34-9339-2943091b3886" containerName="swift-ring-rebalance" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.849026 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac41b9a3-d45a-4e34-9339-2943091b3886" containerName="swift-ring-rebalance" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.849868 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.852546 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.852750 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.859331 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z"] Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.961090 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-dispersionconf\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.961131 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khzxx\" (UniqueName: \"kubernetes.io/projected/dee0b7c0-e613-4bba-94e7-a273c0619276-kube-api-access-khzxx\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.961164 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dee0b7c0-e613-4bba-94e7-a273c0619276-etc-swift\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.961283 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-ring-data-devices\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.961311 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-scripts\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:14 crc kubenswrapper[4831]: I0309 16:32:14.961376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-swiftconf\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.063015 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-dispersionconf\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.063064 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khzxx\" (UniqueName: \"kubernetes.io/projected/dee0b7c0-e613-4bba-94e7-a273c0619276-kube-api-access-khzxx\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.063097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dee0b7c0-e613-4bba-94e7-a273c0619276-etc-swift\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.063150 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-ring-data-devices\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.063188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-scripts\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.063248 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-swiftconf\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.064119 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dee0b7c0-e613-4bba-94e7-a273c0619276-etc-swift\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.064234 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-ring-data-devices\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.064499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-scripts\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.068657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-dispersionconf\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.070256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-swiftconf\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.081481 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khzxx\" (UniqueName: \"kubernetes.io/projected/dee0b7c0-e613-4bba-94e7-a273c0619276-kube-api-access-khzxx\") pod \"swift-ring-rebalance-debug-n5c6z\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.172816 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.581242 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z"] Mar 09 16:32:15 crc kubenswrapper[4831]: W0309 16:32:15.584753 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee0b7c0_e613_4bba_94e7_a273c0619276.slice/crio-85b383c49b2f6c8bb949a6b4a8dd760b7ddafe8bcd892a424176652b4b50f5b3 WatchSource:0}: Error finding container 85b383c49b2f6c8bb949a6b4a8dd760b7ddafe8bcd892a424176652b4b50f5b3: Status 404 returned error can't find the container with id 85b383c49b2f6c8bb949a6b4a8dd760b7ddafe8bcd892a424176652b4b50f5b3 Mar 09 16:32:15 crc kubenswrapper[4831]: I0309 16:32:15.627946 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac41b9a3-d45a-4e34-9339-2943091b3886" path="/var/lib/kubelet/pods/ac41b9a3-d45a-4e34-9339-2943091b3886/volumes" Mar 09 16:32:16 crc kubenswrapper[4831]: I0309 16:32:16.355253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" event={"ID":"dee0b7c0-e613-4bba-94e7-a273c0619276","Type":"ContainerStarted","Data":"81314b12cb02d56273ca18382d65297de17eeb58d0b021a430245c48d86ff4f6"} Mar 09 16:32:16 crc kubenswrapper[4831]: I0309 16:32:16.355593 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" event={"ID":"dee0b7c0-e613-4bba-94e7-a273c0619276","Type":"ContainerStarted","Data":"85b383c49b2f6c8bb949a6b4a8dd760b7ddafe8bcd892a424176652b4b50f5b3"} Mar 09 16:32:16 crc kubenswrapper[4831]: I0309 16:32:16.380593 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" podStartSLOduration=2.380569154 podStartE2EDuration="2.380569154s" podCreationTimestamp="2026-03-09 
16:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:16.379309638 +0000 UTC m=+2063.512992061" watchObservedRunningTime="2026-03-09 16:32:16.380569154 +0000 UTC m=+2063.514251587" Mar 09 16:32:17 crc kubenswrapper[4831]: I0309 16:32:17.364062 4831 generic.go:334] "Generic (PLEG): container finished" podID="dee0b7c0-e613-4bba-94e7-a273c0619276" containerID="81314b12cb02d56273ca18382d65297de17eeb58d0b021a430245c48d86ff4f6" exitCode=0 Mar 09 16:32:17 crc kubenswrapper[4831]: I0309 16:32:17.364169 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" event={"ID":"dee0b7c0-e613-4bba-94e7-a273c0619276","Type":"ContainerDied","Data":"81314b12cb02d56273ca18382d65297de17eeb58d0b021a430245c48d86ff4f6"} Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.665689 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.707195 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z"] Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.711843 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z"] Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.717548 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-dispersionconf\") pod \"dee0b7c0-e613-4bba-94e7-a273c0619276\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.717660 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-swiftconf\") pod \"dee0b7c0-e613-4bba-94e7-a273c0619276\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.717721 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-scripts\") pod \"dee0b7c0-e613-4bba-94e7-a273c0619276\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.717788 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dee0b7c0-e613-4bba-94e7-a273c0619276-etc-swift\") pod \"dee0b7c0-e613-4bba-94e7-a273c0619276\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.717834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-ring-data-devices\") pod \"dee0b7c0-e613-4bba-94e7-a273c0619276\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.717873 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khzxx\" (UniqueName: \"kubernetes.io/projected/dee0b7c0-e613-4bba-94e7-a273c0619276-kube-api-access-khzxx\") pod \"dee0b7c0-e613-4bba-94e7-a273c0619276\" (UID: \"dee0b7c0-e613-4bba-94e7-a273c0619276\") " Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.719667 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dee0b7c0-e613-4bba-94e7-a273c0619276" (UID: "dee0b7c0-e613-4bba-94e7-a273c0619276"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.719870 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee0b7c0-e613-4bba-94e7-a273c0619276-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dee0b7c0-e613-4bba-94e7-a273c0619276" (UID: "dee0b7c0-e613-4bba-94e7-a273c0619276"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.734825 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee0b7c0-e613-4bba-94e7-a273c0619276-kube-api-access-khzxx" (OuterVolumeSpecName: "kube-api-access-khzxx") pod "dee0b7c0-e613-4bba-94e7-a273c0619276" (UID: "dee0b7c0-e613-4bba-94e7-a273c0619276"). InnerVolumeSpecName "kube-api-access-khzxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.738989 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-scripts" (OuterVolumeSpecName: "scripts") pod "dee0b7c0-e613-4bba-94e7-a273c0619276" (UID: "dee0b7c0-e613-4bba-94e7-a273c0619276"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.742009 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dee0b7c0-e613-4bba-94e7-a273c0619276" (UID: "dee0b7c0-e613-4bba-94e7-a273c0619276"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.746991 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dee0b7c0-e613-4bba-94e7-a273c0619276" (UID: "dee0b7c0-e613-4bba-94e7-a273c0619276"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.819344 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dee0b7c0-e613-4bba-94e7-a273c0619276-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.819386 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.819403 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khzxx\" (UniqueName: \"kubernetes.io/projected/dee0b7c0-e613-4bba-94e7-a273c0619276-kube-api-access-khzxx\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.819426 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.819435 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dee0b7c0-e613-4bba-94e7-a273c0619276-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:18 crc kubenswrapper[4831]: I0309 16:32:18.819442 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dee0b7c0-e613-4bba-94e7-a273c0619276-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.381450 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b383c49b2f6c8bb949a6b4a8dd760b7ddafe8bcd892a424176652b4b50f5b3" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.381457 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n5c6z" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.631912 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee0b7c0-e613-4bba-94e7-a273c0619276" path="/var/lib/kubelet/pods/dee0b7c0-e613-4bba-94e7-a273c0619276/volumes" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.827313 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl"] Mar 09 16:32:19 crc kubenswrapper[4831]: E0309 16:32:19.827642 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee0b7c0-e613-4bba-94e7-a273c0619276" containerName="swift-ring-rebalance" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.827658 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee0b7c0-e613-4bba-94e7-a273c0619276" containerName="swift-ring-rebalance" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.827836 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee0b7c0-e613-4bba-94e7-a273c0619276" containerName="swift-ring-rebalance" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.828323 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.832583 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.832919 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.841110 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl"] Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.936556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-dispersionconf\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.936683 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-ring-data-devices\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.936743 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmf58\" (UniqueName: \"kubernetes.io/projected/e128a581-31d3-4b08-ac05-bce480c31f6b-kube-api-access-rmf58\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.936920 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-scripts\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.936975 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e128a581-31d3-4b08-ac05-bce480c31f6b-etc-swift\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:19 crc kubenswrapper[4831]: I0309 16:32:19.937045 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-swiftconf\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.038642 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-swiftconf\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.038782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-dispersionconf\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.038858 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-ring-data-devices\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.038903 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmf58\" (UniqueName: \"kubernetes.io/projected/e128a581-31d3-4b08-ac05-bce480c31f6b-kube-api-access-rmf58\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.039056 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-scripts\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.039096 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e128a581-31d3-4b08-ac05-bce480c31f6b-etc-swift\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.039961 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e128a581-31d3-4b08-ac05-bce480c31f6b-etc-swift\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 
16:32:20.040323 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-ring-data-devices\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.040357 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-scripts\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.050094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-swiftconf\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.050486 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-dispersionconf\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.065126 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmf58\" (UniqueName: \"kubernetes.io/projected/e128a581-31d3-4b08-ac05-bce480c31f6b-kube-api-access-rmf58\") pod \"swift-ring-rebalance-debug-wzmwl\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.144679 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:20 crc kubenswrapper[4831]: I0309 16:32:20.583067 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl"] Mar 09 16:32:20 crc kubenswrapper[4831]: W0309 16:32:20.585349 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode128a581_31d3_4b08_ac05_bce480c31f6b.slice/crio-10848ef93aa27eefcbd80c53be4b2fddea2e59c5da5f735e2697b33ebb7fb23b WatchSource:0}: Error finding container 10848ef93aa27eefcbd80c53be4b2fddea2e59c5da5f735e2697b33ebb7fb23b: Status 404 returned error can't find the container with id 10848ef93aa27eefcbd80c53be4b2fddea2e59c5da5f735e2697b33ebb7fb23b Mar 09 16:32:21 crc kubenswrapper[4831]: I0309 16:32:21.398310 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" event={"ID":"e128a581-31d3-4b08-ac05-bce480c31f6b","Type":"ContainerStarted","Data":"e2d9ec11d3ad904b054bcebfea6c1231955a2383528fefb1b6b1a50ae023e6af"} Mar 09 16:32:21 crc kubenswrapper[4831]: I0309 16:32:21.398720 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" event={"ID":"e128a581-31d3-4b08-ac05-bce480c31f6b","Type":"ContainerStarted","Data":"10848ef93aa27eefcbd80c53be4b2fddea2e59c5da5f735e2697b33ebb7fb23b"} Mar 09 16:32:21 crc kubenswrapper[4831]: I0309 16:32:21.431792 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" podStartSLOduration=2.430035375 podStartE2EDuration="2.430035375s" podCreationTimestamp="2026-03-09 16:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:21.418300212 +0000 UTC m=+2068.551982635" 
watchObservedRunningTime="2026-03-09 16:32:21.430035375 +0000 UTC m=+2068.563717808" Mar 09 16:32:22 crc kubenswrapper[4831]: I0309 16:32:22.406831 4831 generic.go:334] "Generic (PLEG): container finished" podID="e128a581-31d3-4b08-ac05-bce480c31f6b" containerID="e2d9ec11d3ad904b054bcebfea6c1231955a2383528fefb1b6b1a50ae023e6af" exitCode=0 Mar 09 16:32:22 crc kubenswrapper[4831]: I0309 16:32:22.406875 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" event={"ID":"e128a581-31d3-4b08-ac05-bce480c31f6b","Type":"ContainerDied","Data":"e2d9ec11d3ad904b054bcebfea6c1231955a2383528fefb1b6b1a50ae023e6af"} Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.697605 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.731584 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl"] Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.737076 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl"] Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.800277 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-swiftconf\") pod \"e128a581-31d3-4b08-ac05-bce480c31f6b\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.800427 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e128a581-31d3-4b08-ac05-bce480c31f6b-etc-swift\") pod \"e128a581-31d3-4b08-ac05-bce480c31f6b\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.800603 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rmf58\" (UniqueName: \"kubernetes.io/projected/e128a581-31d3-4b08-ac05-bce480c31f6b-kube-api-access-rmf58\") pod \"e128a581-31d3-4b08-ac05-bce480c31f6b\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.800647 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-dispersionconf\") pod \"e128a581-31d3-4b08-ac05-bce480c31f6b\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.800687 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-scripts\") pod \"e128a581-31d3-4b08-ac05-bce480c31f6b\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.800748 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-ring-data-devices\") pod \"e128a581-31d3-4b08-ac05-bce480c31f6b\" (UID: \"e128a581-31d3-4b08-ac05-bce480c31f6b\") " Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.802126 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e128a581-31d3-4b08-ac05-bce480c31f6b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e128a581-31d3-4b08-ac05-bce480c31f6b" (UID: "e128a581-31d3-4b08-ac05-bce480c31f6b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.802287 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e128a581-31d3-4b08-ac05-bce480c31f6b" (UID: "e128a581-31d3-4b08-ac05-bce480c31f6b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.807115 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e128a581-31d3-4b08-ac05-bce480c31f6b-kube-api-access-rmf58" (OuterVolumeSpecName: "kube-api-access-rmf58") pod "e128a581-31d3-4b08-ac05-bce480c31f6b" (UID: "e128a581-31d3-4b08-ac05-bce480c31f6b"). InnerVolumeSpecName "kube-api-access-rmf58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.822160 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e128a581-31d3-4b08-ac05-bce480c31f6b" (UID: "e128a581-31d3-4b08-ac05-bce480c31f6b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.822825 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-scripts" (OuterVolumeSpecName: "scripts") pod "e128a581-31d3-4b08-ac05-bce480c31f6b" (UID: "e128a581-31d3-4b08-ac05-bce480c31f6b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.826197 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e128a581-31d3-4b08-ac05-bce480c31f6b" (UID: "e128a581-31d3-4b08-ac05-bce480c31f6b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.902661 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.902861 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e128a581-31d3-4b08-ac05-bce480c31f6b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.902919 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmf58\" (UniqueName: \"kubernetes.io/projected/e128a581-31d3-4b08-ac05-bce480c31f6b-kube-api-access-rmf58\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.902972 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e128a581-31d3-4b08-ac05-bce480c31f6b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.903054 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:23 crc kubenswrapper[4831]: I0309 16:32:23.903111 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/e128a581-31d3-4b08-ac05-bce480c31f6b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.429650 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10848ef93aa27eefcbd80c53be4b2fddea2e59c5da5f735e2697b33ebb7fb23b" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.429852 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wzmwl" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.933385 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s"] Mar 09 16:32:24 crc kubenswrapper[4831]: E0309 16:32:24.933770 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e128a581-31d3-4b08-ac05-bce480c31f6b" containerName="swift-ring-rebalance" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.933784 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e128a581-31d3-4b08-ac05-bce480c31f6b" containerName="swift-ring-rebalance" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.933947 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e128a581-31d3-4b08-ac05-bce480c31f6b" containerName="swift-ring-rebalance" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.934436 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.937165 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.940046 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:24 crc kubenswrapper[4831]: I0309 16:32:24.945149 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s"] Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.021280 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-scripts\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.021400 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/131c3871-6184-4dd5-92c1-27ecfa3b1545-etc-swift\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.021447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-dispersionconf\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.021486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-swiftconf\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.021502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-ring-data-devices\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.021525 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g87n8\" (UniqueName: \"kubernetes.io/projected/131c3871-6184-4dd5-92c1-27ecfa3b1545-kube-api-access-g87n8\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.122930 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/131c3871-6184-4dd5-92c1-27ecfa3b1545-etc-swift\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.122972 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-dispersionconf\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.123013 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-swiftconf\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.123029 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-ring-data-devices\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.123055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g87n8\" (UniqueName: \"kubernetes.io/projected/131c3871-6184-4dd5-92c1-27ecfa3b1545-kube-api-access-g87n8\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.123108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-scripts\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.123420 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/131c3871-6184-4dd5-92c1-27ecfa3b1545-etc-swift\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.123855 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-scripts\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.123855 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-ring-data-devices\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.127749 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-swiftconf\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.128777 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-dispersionconf\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.139725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g87n8\" (UniqueName: \"kubernetes.io/projected/131c3871-6184-4dd5-92c1-27ecfa3b1545-kube-api-access-g87n8\") pod \"swift-ring-rebalance-debug-nzt7s\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.256868 4831 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.629618 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e128a581-31d3-4b08-ac05-bce480c31f6b" path="/var/lib/kubelet/pods/e128a581-31d3-4b08-ac05-bce480c31f6b/volumes" Mar 09 16:32:25 crc kubenswrapper[4831]: I0309 16:32:25.684033 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s"] Mar 09 16:32:26 crc kubenswrapper[4831]: I0309 16:32:26.451113 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" event={"ID":"131c3871-6184-4dd5-92c1-27ecfa3b1545","Type":"ContainerStarted","Data":"d762e3b6903adff76b40bb6985da7a3a154ea2035025890ef8c80c117afb8fc3"} Mar 09 16:32:26 crc kubenswrapper[4831]: I0309 16:32:26.451626 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" event={"ID":"131c3871-6184-4dd5-92c1-27ecfa3b1545","Type":"ContainerStarted","Data":"d54315df36442ec490b3a48f1537ec0a11dbb1d173c6085f02808a34b2034f28"} Mar 09 16:32:26 crc kubenswrapper[4831]: I0309 16:32:26.485352 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" podStartSLOduration=2.485302979 podStartE2EDuration="2.485302979s" podCreationTimestamp="2026-03-09 16:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:26.483314582 +0000 UTC m=+2073.616997015" watchObservedRunningTime="2026-03-09 16:32:26.485302979 +0000 UTC m=+2073.618985442" Mar 09 16:32:27 crc kubenswrapper[4831]: I0309 16:32:27.462897 4831 generic.go:334] "Generic (PLEG): container finished" podID="131c3871-6184-4dd5-92c1-27ecfa3b1545" 
containerID="d762e3b6903adff76b40bb6985da7a3a154ea2035025890ef8c80c117afb8fc3" exitCode=0 Mar 09 16:32:27 crc kubenswrapper[4831]: I0309 16:32:27.462934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" event={"ID":"131c3871-6184-4dd5-92c1-27ecfa3b1545","Type":"ContainerDied","Data":"d762e3b6903adff76b40bb6985da7a3a154ea2035025890ef8c80c117afb8fc3"} Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.774526 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.820603 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s"] Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.826528 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s"] Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.900485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-scripts\") pod \"131c3871-6184-4dd5-92c1-27ecfa3b1545\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.900545 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g87n8\" (UniqueName: \"kubernetes.io/projected/131c3871-6184-4dd5-92c1-27ecfa3b1545-kube-api-access-g87n8\") pod \"131c3871-6184-4dd5-92c1-27ecfa3b1545\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.900593 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/131c3871-6184-4dd5-92c1-27ecfa3b1545-etc-swift\") pod \"131c3871-6184-4dd5-92c1-27ecfa3b1545\" (UID: 
\"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.900630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-swiftconf\") pod \"131c3871-6184-4dd5-92c1-27ecfa3b1545\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.900655 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-ring-data-devices\") pod \"131c3871-6184-4dd5-92c1-27ecfa3b1545\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.900687 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-dispersionconf\") pod \"131c3871-6184-4dd5-92c1-27ecfa3b1545\" (UID: \"131c3871-6184-4dd5-92c1-27ecfa3b1545\") " Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.902015 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "131c3871-6184-4dd5-92c1-27ecfa3b1545" (UID: "131c3871-6184-4dd5-92c1-27ecfa3b1545"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.902552 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131c3871-6184-4dd5-92c1-27ecfa3b1545-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "131c3871-6184-4dd5-92c1-27ecfa3b1545" (UID: "131c3871-6184-4dd5-92c1-27ecfa3b1545"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.912930 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131c3871-6184-4dd5-92c1-27ecfa3b1545-kube-api-access-g87n8" (OuterVolumeSpecName: "kube-api-access-g87n8") pod "131c3871-6184-4dd5-92c1-27ecfa3b1545" (UID: "131c3871-6184-4dd5-92c1-27ecfa3b1545"). InnerVolumeSpecName "kube-api-access-g87n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.921133 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "131c3871-6184-4dd5-92c1-27ecfa3b1545" (UID: "131c3871-6184-4dd5-92c1-27ecfa3b1545"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.924123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-scripts" (OuterVolumeSpecName: "scripts") pod "131c3871-6184-4dd5-92c1-27ecfa3b1545" (UID: "131c3871-6184-4dd5-92c1-27ecfa3b1545"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:28 crc kubenswrapper[4831]: I0309 16:32:28.934467 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "131c3871-6184-4dd5-92c1-27ecfa3b1545" (UID: "131c3871-6184-4dd5-92c1-27ecfa3b1545"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.002029 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/131c3871-6184-4dd5-92c1-27ecfa3b1545-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.002073 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.002083 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.002097 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/131c3871-6184-4dd5-92c1-27ecfa3b1545-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.002106 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/131c3871-6184-4dd5-92c1-27ecfa3b1545-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.002115 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g87n8\" (UniqueName: \"kubernetes.io/projected/131c3871-6184-4dd5-92c1-27ecfa3b1545-kube-api-access-g87n8\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.481802 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d54315df36442ec490b3a48f1537ec0a11dbb1d173c6085f02808a34b2034f28" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.481856 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nzt7s" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.626010 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131c3871-6184-4dd5-92c1-27ecfa3b1545" path="/var/lib/kubelet/pods/131c3871-6184-4dd5-92c1-27ecfa3b1545/volumes" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.943291 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn"] Mar 09 16:32:29 crc kubenswrapper[4831]: E0309 16:32:29.943827 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131c3871-6184-4dd5-92c1-27ecfa3b1545" containerName="swift-ring-rebalance" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.943856 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="131c3871-6184-4dd5-92c1-27ecfa3b1545" containerName="swift-ring-rebalance" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.944175 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="131c3871-6184-4dd5-92c1-27ecfa3b1545" containerName="swift-ring-rebalance" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.945046 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.955722 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn"] Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.957553 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:29 crc kubenswrapper[4831]: I0309 16:32:29.958242 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.016381 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-ring-data-devices\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.016565 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bdb76371-348e-44a9-8ed5-776ccaa14c69-etc-swift\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.016629 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wkd4\" (UniqueName: \"kubernetes.io/projected/bdb76371-348e-44a9-8ed5-776ccaa14c69-kube-api-access-5wkd4\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.016672 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-dispersionconf\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.016861 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-swiftconf\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.016973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-scripts\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.118736 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-ring-data-devices\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.118808 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bdb76371-348e-44a9-8ed5-776ccaa14c69-etc-swift\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: 
I0309 16:32:30.118841 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wkd4\" (UniqueName: \"kubernetes.io/projected/bdb76371-348e-44a9-8ed5-776ccaa14c69-kube-api-access-5wkd4\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.118869 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-dispersionconf\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.118918 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-swiftconf\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.118964 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-scripts\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.119469 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bdb76371-348e-44a9-8ed5-776ccaa14c69-etc-swift\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 
16:32:30.119797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-ring-data-devices\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.121212 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-scripts\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.123421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-dispersionconf\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.124792 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-swiftconf\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.140959 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wkd4\" (UniqueName: \"kubernetes.io/projected/bdb76371-348e-44a9-8ed5-776ccaa14c69-kube-api-access-5wkd4\") pod \"swift-ring-rebalance-debug-nfvhn\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.274983 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:30 crc kubenswrapper[4831]: I0309 16:32:30.537699 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn"] Mar 09 16:32:30 crc kubenswrapper[4831]: W0309 16:32:30.538057 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdb76371_348e_44a9_8ed5_776ccaa14c69.slice/crio-2cbbb763e91062ef22c638ef8c89ff60ac14293c4ffdfb51b52dd32753f77e52 WatchSource:0}: Error finding container 2cbbb763e91062ef22c638ef8c89ff60ac14293c4ffdfb51b52dd32753f77e52: Status 404 returned error can't find the container with id 2cbbb763e91062ef22c638ef8c89ff60ac14293c4ffdfb51b52dd32753f77e52 Mar 09 16:32:31 crc kubenswrapper[4831]: I0309 16:32:31.505808 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" event={"ID":"bdb76371-348e-44a9-8ed5-776ccaa14c69","Type":"ContainerStarted","Data":"2fcaece23124da21d8bd80082a67824c2138721de06ab9ccccf230f6c63a0960"} Mar 09 16:32:31 crc kubenswrapper[4831]: I0309 16:32:31.506206 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" event={"ID":"bdb76371-348e-44a9-8ed5-776ccaa14c69","Type":"ContainerStarted","Data":"2cbbb763e91062ef22c638ef8c89ff60ac14293c4ffdfb51b52dd32753f77e52"} Mar 09 16:32:31 crc kubenswrapper[4831]: I0309 16:32:31.531082 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" podStartSLOduration=2.5310609939999997 podStartE2EDuration="2.531060994s" podCreationTimestamp="2026-03-09 16:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:31.525511506 +0000 UTC 
m=+2078.659193969" watchObservedRunningTime="2026-03-09 16:32:31.531060994 +0000 UTC m=+2078.664743437" Mar 09 16:32:32 crc kubenswrapper[4831]: I0309 16:32:32.516112 4831 generic.go:334] "Generic (PLEG): container finished" podID="bdb76371-348e-44a9-8ed5-776ccaa14c69" containerID="2fcaece23124da21d8bd80082a67824c2138721de06ab9ccccf230f6c63a0960" exitCode=0 Mar 09 16:32:32 crc kubenswrapper[4831]: I0309 16:32:32.516184 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" event={"ID":"bdb76371-348e-44a9-8ed5-776ccaa14c69","Type":"ContainerDied","Data":"2fcaece23124da21d8bd80082a67824c2138721de06ab9ccccf230f6c63a0960"} Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.018843 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.018954 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.810284 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.851359 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn"] Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.857236 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn"] Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.881525 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wkd4\" (UniqueName: \"kubernetes.io/projected/bdb76371-348e-44a9-8ed5-776ccaa14c69-kube-api-access-5wkd4\") pod \"bdb76371-348e-44a9-8ed5-776ccaa14c69\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.881786 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bdb76371-348e-44a9-8ed5-776ccaa14c69-etc-swift\") pod \"bdb76371-348e-44a9-8ed5-776ccaa14c69\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.881917 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-swiftconf\") pod \"bdb76371-348e-44a9-8ed5-776ccaa14c69\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.882010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-ring-data-devices\") pod \"bdb76371-348e-44a9-8ed5-776ccaa14c69\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.882105 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-dispersionconf\") pod \"bdb76371-348e-44a9-8ed5-776ccaa14c69\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.882284 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-scripts\") pod \"bdb76371-348e-44a9-8ed5-776ccaa14c69\" (UID: \"bdb76371-348e-44a9-8ed5-776ccaa14c69\") " Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.883770 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bdb76371-348e-44a9-8ed5-776ccaa14c69" (UID: "bdb76371-348e-44a9-8ed5-776ccaa14c69"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.884239 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb76371-348e-44a9-8ed5-776ccaa14c69-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bdb76371-348e-44a9-8ed5-776ccaa14c69" (UID: "bdb76371-348e-44a9-8ed5-776ccaa14c69"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.887446 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb76371-348e-44a9-8ed5-776ccaa14c69-kube-api-access-5wkd4" (OuterVolumeSpecName: "kube-api-access-5wkd4") pod "bdb76371-348e-44a9-8ed5-776ccaa14c69" (UID: "bdb76371-348e-44a9-8ed5-776ccaa14c69"). InnerVolumeSpecName "kube-api-access-5wkd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.900319 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-scripts" (OuterVolumeSpecName: "scripts") pod "bdb76371-348e-44a9-8ed5-776ccaa14c69" (UID: "bdb76371-348e-44a9-8ed5-776ccaa14c69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.903628 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bdb76371-348e-44a9-8ed5-776ccaa14c69" (UID: "bdb76371-348e-44a9-8ed5-776ccaa14c69"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.904922 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bdb76371-348e-44a9-8ed5-776ccaa14c69" (UID: "bdb76371-348e-44a9-8ed5-776ccaa14c69"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.984623 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wkd4\" (UniqueName: \"kubernetes.io/projected/bdb76371-348e-44a9-8ed5-776ccaa14c69-kube-api-access-5wkd4\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.984657 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bdb76371-348e-44a9-8ed5-776ccaa14c69-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.984668 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.984677 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.984685 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bdb76371-348e-44a9-8ed5-776ccaa14c69-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:33 crc kubenswrapper[4831]: I0309 16:32:33.984694 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb76371-348e-44a9-8ed5-776ccaa14c69-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.538690 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cbbb763e91062ef22c638ef8c89ff60ac14293c4ffdfb51b52dd32753f77e52" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.538736 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfvhn" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.983826 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq"] Mar 09 16:32:34 crc kubenswrapper[4831]: E0309 16:32:34.984170 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb76371-348e-44a9-8ed5-776ccaa14c69" containerName="swift-ring-rebalance" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.984184 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb76371-348e-44a9-8ed5-776ccaa14c69" containerName="swift-ring-rebalance" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.984367 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb76371-348e-44a9-8ed5-776ccaa14c69" containerName="swift-ring-rebalance" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.984949 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.987369 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.988303 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:34 crc kubenswrapper[4831]: I0309 16:32:34.994956 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq"] Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.102785 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e142aad-1420-43ab-881e-5d3727a7942b-etc-swift\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.102962 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-ring-data-devices\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.103041 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-scripts\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.103077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6v4g\" (UniqueName: \"kubernetes.io/projected/9e142aad-1420-43ab-881e-5d3727a7942b-kube-api-access-f6v4g\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.103140 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-swiftconf\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.103168 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-dispersionconf\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.204880 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6v4g\" (UniqueName: \"kubernetes.io/projected/9e142aad-1420-43ab-881e-5d3727a7942b-kube-api-access-f6v4g\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.204967 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-swiftconf\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.205003 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-dispersionconf\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.205098 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e142aad-1420-43ab-881e-5d3727a7942b-etc-swift\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.205183 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-ring-data-devices\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.205249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-scripts\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.205832 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e142aad-1420-43ab-881e-5d3727a7942b-etc-swift\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.205962 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-ring-data-devices\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.206338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-scripts\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.209066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-swiftconf\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.209463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-dispersionconf\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.219918 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6v4g\" (UniqueName: \"kubernetes.io/projected/9e142aad-1420-43ab-881e-5d3727a7942b-kube-api-access-f6v4g\") pod \"swift-ring-rebalance-debug-xqnkq\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.302472 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.548597 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq"] Mar 09 16:32:35 crc kubenswrapper[4831]: I0309 16:32:35.628358 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb76371-348e-44a9-8ed5-776ccaa14c69" path="/var/lib/kubelet/pods/bdb76371-348e-44a9-8ed5-776ccaa14c69/volumes" Mar 09 16:32:36 crc kubenswrapper[4831]: I0309 16:32:36.554943 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" event={"ID":"9e142aad-1420-43ab-881e-5d3727a7942b","Type":"ContainerStarted","Data":"c91a527411a8f1d27d6316eb1f44eec430ae08ec957a8c34301ad0e02089fc69"} Mar 09 16:32:36 crc kubenswrapper[4831]: I0309 16:32:36.554992 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" event={"ID":"9e142aad-1420-43ab-881e-5d3727a7942b","Type":"ContainerStarted","Data":"3afba6a578019d88c29814b8131e3c9a65ec6fe2ab8420c4614d8c399b493f0d"} Mar 09 16:32:36 crc kubenswrapper[4831]: I0309 16:32:36.578335 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" podStartSLOduration=2.57831434 podStartE2EDuration="2.57831434s" podCreationTimestamp="2026-03-09 16:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:36.569749127 +0000 UTC m=+2083.703431560" watchObservedRunningTime="2026-03-09 16:32:36.57831434 +0000 UTC m=+2083.711996753" Mar 09 16:32:37 crc kubenswrapper[4831]: I0309 16:32:37.566545 4831 generic.go:334] "Generic (PLEG): container finished" podID="9e142aad-1420-43ab-881e-5d3727a7942b" containerID="c91a527411a8f1d27d6316eb1f44eec430ae08ec957a8c34301ad0e02089fc69" exitCode=0 
Mar 09 16:32:37 crc kubenswrapper[4831]: I0309 16:32:37.566749 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" event={"ID":"9e142aad-1420-43ab-881e-5d3727a7942b","Type":"ContainerDied","Data":"c91a527411a8f1d27d6316eb1f44eec430ae08ec957a8c34301ad0e02089fc69"} Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.848644 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.882603 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq"] Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.888304 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq"] Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.958542 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-dispersionconf\") pod \"9e142aad-1420-43ab-881e-5d3727a7942b\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.958670 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-ring-data-devices\") pod \"9e142aad-1420-43ab-881e-5d3727a7942b\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.959282 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9e142aad-1420-43ab-881e-5d3727a7942b" (UID: "9e142aad-1420-43ab-881e-5d3727a7942b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.959350 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-swiftconf\") pod \"9e142aad-1420-43ab-881e-5d3727a7942b\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.959712 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e142aad-1420-43ab-881e-5d3727a7942b-etc-swift\") pod \"9e142aad-1420-43ab-881e-5d3727a7942b\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.959757 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6v4g\" (UniqueName: \"kubernetes.io/projected/9e142aad-1420-43ab-881e-5d3727a7942b-kube-api-access-f6v4g\") pod \"9e142aad-1420-43ab-881e-5d3727a7942b\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.959812 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-scripts\") pod \"9e142aad-1420-43ab-881e-5d3727a7942b\" (UID: \"9e142aad-1420-43ab-881e-5d3727a7942b\") " Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.960249 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e142aad-1420-43ab-881e-5d3727a7942b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9e142aad-1420-43ab-881e-5d3727a7942b" (UID: "9e142aad-1420-43ab-881e-5d3727a7942b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.960602 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e142aad-1420-43ab-881e-5d3727a7942b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.960627 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.963294 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e142aad-1420-43ab-881e-5d3727a7942b-kube-api-access-f6v4g" (OuterVolumeSpecName: "kube-api-access-f6v4g") pod "9e142aad-1420-43ab-881e-5d3727a7942b" (UID: "9e142aad-1420-43ab-881e-5d3727a7942b"). InnerVolumeSpecName "kube-api-access-f6v4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.977137 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-scripts" (OuterVolumeSpecName: "scripts") pod "9e142aad-1420-43ab-881e-5d3727a7942b" (UID: "9e142aad-1420-43ab-881e-5d3727a7942b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.978603 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9e142aad-1420-43ab-881e-5d3727a7942b" (UID: "9e142aad-1420-43ab-881e-5d3727a7942b"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:38 crc kubenswrapper[4831]: I0309 16:32:38.980485 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9e142aad-1420-43ab-881e-5d3727a7942b" (UID: "9e142aad-1420-43ab-881e-5d3727a7942b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:39 crc kubenswrapper[4831]: I0309 16:32:39.062028 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6v4g\" (UniqueName: \"kubernetes.io/projected/9e142aad-1420-43ab-881e-5d3727a7942b-kube-api-access-f6v4g\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:39 crc kubenswrapper[4831]: I0309 16:32:39.062078 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e142aad-1420-43ab-881e-5d3727a7942b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:39 crc kubenswrapper[4831]: I0309 16:32:39.062097 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:39 crc kubenswrapper[4831]: I0309 16:32:39.062115 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e142aad-1420-43ab-881e-5d3727a7942b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:39 crc kubenswrapper[4831]: I0309 16:32:39.585253 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3afba6a578019d88c29814b8131e3c9a65ec6fe2ab8420c4614d8c399b493f0d" Mar 09 16:32:39 crc kubenswrapper[4831]: I0309 16:32:39.585347 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xqnkq" Mar 09 16:32:39 crc kubenswrapper[4831]: I0309 16:32:39.629026 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e142aad-1420-43ab-881e-5d3727a7942b" path="/var/lib/kubelet/pods/9e142aad-1420-43ab-881e-5d3727a7942b/volumes" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.013259 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf"] Mar 09 16:32:40 crc kubenswrapper[4831]: E0309 16:32:40.013745 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e142aad-1420-43ab-881e-5d3727a7942b" containerName="swift-ring-rebalance" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.013761 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e142aad-1420-43ab-881e-5d3727a7942b" containerName="swift-ring-rebalance" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.013944 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e142aad-1420-43ab-881e-5d3727a7942b" containerName="swift-ring-rebalance" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.014504 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.016923 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.017074 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.022798 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf"] Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.076574 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-scripts\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.076622 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-dispersionconf\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.076661 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.076863 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/750eb3ef-d4cd-4aee-910c-467bcf8346b1-etc-swift\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.076956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdxb\" (UniqueName: \"kubernetes.io/projected/750eb3ef-d4cd-4aee-910c-467bcf8346b1-kube-api-access-vtdxb\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.077077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-swiftconf\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.179082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-swiftconf\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.179195 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-scripts\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 
16:32:40.179233 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-dispersionconf\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.179287 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.179370 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/750eb3ef-d4cd-4aee-910c-467bcf8346b1-etc-swift\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.179452 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdxb\" (UniqueName: \"kubernetes.io/projected/750eb3ef-d4cd-4aee-910c-467bcf8346b1-kube-api-access-vtdxb\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.180021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/750eb3ef-d4cd-4aee-910c-467bcf8346b1-etc-swift\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc 
kubenswrapper[4831]: I0309 16:32:40.180266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-scripts\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.180438 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.184295 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-dispersionconf\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.199770 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-swiftconf\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.203810 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdxb\" (UniqueName: \"kubernetes.io/projected/750eb3ef-d4cd-4aee-910c-467bcf8346b1-kube-api-access-vtdxb\") pod \"swift-ring-rebalance-debug-xxgsf\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: 
I0309 16:32:40.336998 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:40 crc kubenswrapper[4831]: I0309 16:32:40.874548 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf"] Mar 09 16:32:41 crc kubenswrapper[4831]: I0309 16:32:41.601794 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" event={"ID":"750eb3ef-d4cd-4aee-910c-467bcf8346b1","Type":"ContainerStarted","Data":"bd86660f9d20b5a8679ee457db628d395d5d54ad4094cb56560fe4e5c3320971"} Mar 09 16:32:41 crc kubenswrapper[4831]: I0309 16:32:41.602225 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" event={"ID":"750eb3ef-d4cd-4aee-910c-467bcf8346b1","Type":"ContainerStarted","Data":"90868b6151028842d9fda2b8ddd300a5408d46a3c351499b07e09cff3adc2493"} Mar 09 16:32:42 crc kubenswrapper[4831]: I0309 16:32:42.611745 4831 generic.go:334] "Generic (PLEG): container finished" podID="750eb3ef-d4cd-4aee-910c-467bcf8346b1" containerID="bd86660f9d20b5a8679ee457db628d395d5d54ad4094cb56560fe4e5c3320971" exitCode=0 Mar 09 16:32:42 crc kubenswrapper[4831]: I0309 16:32:42.611854 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" event={"ID":"750eb3ef-d4cd-4aee-910c-467bcf8346b1","Type":"ContainerDied","Data":"bd86660f9d20b5a8679ee457db628d395d5d54ad4094cb56560fe4e5c3320971"} Mar 09 16:32:43 crc kubenswrapper[4831]: I0309 16:32:43.958253 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:43 crc kubenswrapper[4831]: I0309 16:32:43.995447 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf"] Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.000793 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf"] Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.145598 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-ring-data-devices\") pod \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.145661 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-scripts\") pod \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.145706 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-swiftconf\") pod \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.145731 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-dispersionconf\") pod \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.145750 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/750eb3ef-d4cd-4aee-910c-467bcf8346b1-etc-swift\") pod \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.145812 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtdxb\" (UniqueName: \"kubernetes.io/projected/750eb3ef-d4cd-4aee-910c-467bcf8346b1-kube-api-access-vtdxb\") pod \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\" (UID: \"750eb3ef-d4cd-4aee-910c-467bcf8346b1\") " Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.146246 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "750eb3ef-d4cd-4aee-910c-467bcf8346b1" (UID: "750eb3ef-d4cd-4aee-910c-467bcf8346b1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.146853 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750eb3ef-d4cd-4aee-910c-467bcf8346b1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "750eb3ef-d4cd-4aee-910c-467bcf8346b1" (UID: "750eb3ef-d4cd-4aee-910c-467bcf8346b1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.151201 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750eb3ef-d4cd-4aee-910c-467bcf8346b1-kube-api-access-vtdxb" (OuterVolumeSpecName: "kube-api-access-vtdxb") pod "750eb3ef-d4cd-4aee-910c-467bcf8346b1" (UID: "750eb3ef-d4cd-4aee-910c-467bcf8346b1"). InnerVolumeSpecName "kube-api-access-vtdxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.166955 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-scripts" (OuterVolumeSpecName: "scripts") pod "750eb3ef-d4cd-4aee-910c-467bcf8346b1" (UID: "750eb3ef-d4cd-4aee-910c-467bcf8346b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.168386 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "750eb3ef-d4cd-4aee-910c-467bcf8346b1" (UID: "750eb3ef-d4cd-4aee-910c-467bcf8346b1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.169742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "750eb3ef-d4cd-4aee-910c-467bcf8346b1" (UID: "750eb3ef-d4cd-4aee-910c-467bcf8346b1"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.247230 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.247273 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/750eb3ef-d4cd-4aee-910c-467bcf8346b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.247301 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.247310 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/750eb3ef-d4cd-4aee-910c-467bcf8346b1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.247318 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/750eb3ef-d4cd-4aee-910c-467bcf8346b1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.247326 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtdxb\" (UniqueName: \"kubernetes.io/projected/750eb3ef-d4cd-4aee-910c-467bcf8346b1-kube-api-access-vtdxb\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.628368 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90868b6151028842d9fda2b8ddd300a5408d46a3c351499b07e09cff3adc2493" Mar 09 16:32:44 crc kubenswrapper[4831]: I0309 16:32:44.628498 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xxgsf" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.150626 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-72z74"] Mar 09 16:32:45 crc kubenswrapper[4831]: E0309 16:32:45.150956 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750eb3ef-d4cd-4aee-910c-467bcf8346b1" containerName="swift-ring-rebalance" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.150981 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="750eb3ef-d4cd-4aee-910c-467bcf8346b1" containerName="swift-ring-rebalance" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.151217 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="750eb3ef-d4cd-4aee-910c-467bcf8346b1" containerName="swift-ring-rebalance" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.151899 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.154662 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.160695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-ring-data-devices\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.160983 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d226d5a-4969-40f0-bbe4-587dcd33fe99-etc-swift\") pod \"swift-ring-rebalance-debug-72z74\" (UID: 
\"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.161179 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-swiftconf\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.161397 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-dispersionconf\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.161675 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-scripts\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.162047 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6npzj\" (UniqueName: \"kubernetes.io/projected/5d226d5a-4969-40f0-bbe4-587dcd33fe99-kube-api-access-6npzj\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.162589 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:45 crc kubenswrapper[4831]: 
I0309 16:32:45.165007 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-72z74"] Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.263366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-ring-data-devices\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.263431 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d226d5a-4969-40f0-bbe4-587dcd33fe99-etc-swift\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.263473 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-swiftconf\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.263508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-dispersionconf\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.263536 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-scripts\") pod 
\"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.263559 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6npzj\" (UniqueName: \"kubernetes.io/projected/5d226d5a-4969-40f0-bbe4-587dcd33fe99-kube-api-access-6npzj\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.264515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-ring-data-devices\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.264535 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-scripts\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.264597 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d226d5a-4969-40f0-bbe4-587dcd33fe99-etc-swift\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.267234 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-dispersionconf\") pod 
\"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.268130 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-swiftconf\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.280099 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6npzj\" (UniqueName: \"kubernetes.io/projected/5d226d5a-4969-40f0-bbe4-587dcd33fe99-kube-api-access-6npzj\") pod \"swift-ring-rebalance-debug-72z74\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.479052 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.626349 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750eb3ef-d4cd-4aee-910c-467bcf8346b1" path="/var/lib/kubelet/pods/750eb3ef-d4cd-4aee-910c-467bcf8346b1/volumes" Mar 09 16:32:45 crc kubenswrapper[4831]: I0309 16:32:45.928838 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-72z74"] Mar 09 16:32:46 crc kubenswrapper[4831]: I0309 16:32:46.649879 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" event={"ID":"5d226d5a-4969-40f0-bbe4-587dcd33fe99","Type":"ContainerStarted","Data":"ce85c8fc4b0a6de223f5b5434f0d5246f3d1efaa7794b3c3a0ba147a325e2692"} Mar 09 16:32:46 crc kubenswrapper[4831]: I0309 16:32:46.650456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" event={"ID":"5d226d5a-4969-40f0-bbe4-587dcd33fe99","Type":"ContainerStarted","Data":"e3de310eab2dc5ea80f82a63d74e015ac9ab568e6994a6e56e42afb680713ec7"} Mar 09 16:32:46 crc kubenswrapper[4831]: I0309 16:32:46.683065 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" podStartSLOduration=1.683040613 podStartE2EDuration="1.683040613s" podCreationTimestamp="2026-03-09 16:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:46.673393219 +0000 UTC m=+2093.807075642" watchObservedRunningTime="2026-03-09 16:32:46.683040613 +0000 UTC m=+2093.816723036" Mar 09 16:32:47 crc kubenswrapper[4831]: I0309 16:32:47.660502 4831 generic.go:334] "Generic (PLEG): container finished" podID="5d226d5a-4969-40f0-bbe4-587dcd33fe99" containerID="ce85c8fc4b0a6de223f5b5434f0d5246f3d1efaa7794b3c3a0ba147a325e2692" exitCode=0 
Mar 09 16:32:47 crc kubenswrapper[4831]: I0309 16:32:47.660862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" event={"ID":"5d226d5a-4969-40f0-bbe4-587dcd33fe99","Type":"ContainerDied","Data":"ce85c8fc4b0a6de223f5b5434f0d5246f3d1efaa7794b3c3a0ba147a325e2692"} Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.037584 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.074443 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-72z74"] Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.083141 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-72z74"] Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.220235 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-dispersionconf\") pod \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.220465 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6npzj\" (UniqueName: \"kubernetes.io/projected/5d226d5a-4969-40f0-bbe4-587dcd33fe99-kube-api-access-6npzj\") pod \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.220526 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-ring-data-devices\") pod \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " Mar 09 16:32:49 crc kubenswrapper[4831]: 
I0309 16:32:49.220573 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d226d5a-4969-40f0-bbe4-587dcd33fe99-etc-swift\") pod \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.220622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-scripts\") pod \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.221044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-swiftconf\") pod \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\" (UID: \"5d226d5a-4969-40f0-bbe4-587dcd33fe99\") " Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.221061 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5d226d5a-4969-40f0-bbe4-587dcd33fe99" (UID: "5d226d5a-4969-40f0-bbe4-587dcd33fe99"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.221658 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d226d5a-4969-40f0-bbe4-587dcd33fe99-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5d226d5a-4969-40f0-bbe4-587dcd33fe99" (UID: "5d226d5a-4969-40f0-bbe4-587dcd33fe99"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.222653 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d226d5a-4969-40f0-bbe4-587dcd33fe99-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.222686 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.226778 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d226d5a-4969-40f0-bbe4-587dcd33fe99-kube-api-access-6npzj" (OuterVolumeSpecName: "kube-api-access-6npzj") pod "5d226d5a-4969-40f0-bbe4-587dcd33fe99" (UID: "5d226d5a-4969-40f0-bbe4-587dcd33fe99"). InnerVolumeSpecName "kube-api-access-6npzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.241049 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-scripts" (OuterVolumeSpecName: "scripts") pod "5d226d5a-4969-40f0-bbe4-587dcd33fe99" (UID: "5d226d5a-4969-40f0-bbe4-587dcd33fe99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.243211 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5d226d5a-4969-40f0-bbe4-587dcd33fe99" (UID: "5d226d5a-4969-40f0-bbe4-587dcd33fe99"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.257649 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5d226d5a-4969-40f0-bbe4-587dcd33fe99" (UID: "5d226d5a-4969-40f0-bbe4-587dcd33fe99"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.324199 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.324243 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d226d5a-4969-40f0-bbe4-587dcd33fe99-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.324258 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6npzj\" (UniqueName: \"kubernetes.io/projected/5d226d5a-4969-40f0-bbe4-587dcd33fe99-kube-api-access-6npzj\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.324269 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d226d5a-4969-40f0-bbe4-587dcd33fe99-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.625444 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d226d5a-4969-40f0-bbe4-587dcd33fe99" path="/var/lib/kubelet/pods/5d226d5a-4969-40f0-bbe4-587dcd33fe99/volumes" Mar 09 16:32:49 crc kubenswrapper[4831]: I0309 16:32:49.678269 4831 scope.go:117] "RemoveContainer" containerID="ce85c8fc4b0a6de223f5b5434f0d5246f3d1efaa7794b3c3a0ba147a325e2692" Mar 09 16:32:49 crc kubenswrapper[4831]: 
I0309 16:32:49.678329 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-72z74" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.243249 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm"] Mar 09 16:32:50 crc kubenswrapper[4831]: E0309 16:32:50.244438 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d226d5a-4969-40f0-bbe4-587dcd33fe99" containerName="swift-ring-rebalance" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.244458 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d226d5a-4969-40f0-bbe4-587dcd33fe99" containerName="swift-ring-rebalance" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.244683 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d226d5a-4969-40f0-bbe4-587dcd33fe99" containerName="swift-ring-rebalance" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.245329 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.248289 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.248831 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.259078 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm"] Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.441126 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-dispersionconf\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.441219 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-ring-data-devices\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.441268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-scripts\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.441299 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-swiftconf\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.441338 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3213ec5a-de3f-4686-9363-785046c860f3-etc-swift\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.441373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk8n8\" (UniqueName: \"kubernetes.io/projected/3213ec5a-de3f-4686-9363-785046c860f3-kube-api-access-wk8n8\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.543356 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3213ec5a-de3f-4686-9363-785046c860f3-etc-swift\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.543609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk8n8\" (UniqueName: \"kubernetes.io/projected/3213ec5a-de3f-4686-9363-785046c860f3-kube-api-access-wk8n8\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 
crc kubenswrapper[4831]: I0309 16:32:50.543708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-dispersionconf\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.543819 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-ring-data-devices\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.543931 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-scripts\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.544035 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-swiftconf\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.544363 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3213ec5a-de3f-4686-9363-785046c860f3-etc-swift\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc 
kubenswrapper[4831]: I0309 16:32:50.545494 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-ring-data-devices\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.545575 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-scripts\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.550008 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-swiftconf\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.550102 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-dispersionconf\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.558897 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk8n8\" (UniqueName: \"kubernetes.io/projected/3213ec5a-de3f-4686-9363-785046c860f3-kube-api-access-wk8n8\") pod \"swift-ring-rebalance-debug-wm9fm\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: 
I0309 16:32:50.561883 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:50 crc kubenswrapper[4831]: I0309 16:32:50.772439 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm"] Mar 09 16:32:50 crc kubenswrapper[4831]: W0309 16:32:50.775660 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3213ec5a_de3f_4686_9363_785046c860f3.slice/crio-4c8fbe4a2fa432c06bc9b84763733b57575356fb2a9b44d09b631e131129ed67 WatchSource:0}: Error finding container 4c8fbe4a2fa432c06bc9b84763733b57575356fb2a9b44d09b631e131129ed67: Status 404 returned error can't find the container with id 4c8fbe4a2fa432c06bc9b84763733b57575356fb2a9b44d09b631e131129ed67 Mar 09 16:32:51 crc kubenswrapper[4831]: I0309 16:32:51.711644 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" event={"ID":"3213ec5a-de3f-4686-9363-785046c860f3","Type":"ContainerStarted","Data":"1df6a6ba8f4388be88363a7276537f409bcf5575dd95acac3153458132da19e0"} Mar 09 16:32:51 crc kubenswrapper[4831]: I0309 16:32:51.711921 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" event={"ID":"3213ec5a-de3f-4686-9363-785046c860f3","Type":"ContainerStarted","Data":"4c8fbe4a2fa432c06bc9b84763733b57575356fb2a9b44d09b631e131129ed67"} Mar 09 16:32:51 crc kubenswrapper[4831]: I0309 16:32:51.728187 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" podStartSLOduration=1.7281635400000002 podStartE2EDuration="1.72816354s" podCreationTimestamp="2026-03-09 16:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:51.725312379 
+0000 UTC m=+2098.858994812" watchObservedRunningTime="2026-03-09 16:32:51.72816354 +0000 UTC m=+2098.861845973" Mar 09 16:32:52 crc kubenswrapper[4831]: I0309 16:32:52.719973 4831 generic.go:334] "Generic (PLEG): container finished" podID="3213ec5a-de3f-4686-9363-785046c860f3" containerID="1df6a6ba8f4388be88363a7276537f409bcf5575dd95acac3153458132da19e0" exitCode=0 Mar 09 16:32:52 crc kubenswrapper[4831]: I0309 16:32:52.720026 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" event={"ID":"3213ec5a-de3f-4686-9363-785046c860f3","Type":"ContainerDied","Data":"1df6a6ba8f4388be88363a7276537f409bcf5575dd95acac3153458132da19e0"} Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.025184 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.062836 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm"] Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.070469 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm"] Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.212133 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk8n8\" (UniqueName: \"kubernetes.io/projected/3213ec5a-de3f-4686-9363-785046c860f3-kube-api-access-wk8n8\") pod \"3213ec5a-de3f-4686-9363-785046c860f3\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.212525 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-swiftconf\") pod \"3213ec5a-de3f-4686-9363-785046c860f3\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " Mar 09 16:32:54 crc kubenswrapper[4831]: 
I0309 16:32:54.212632 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-ring-data-devices\") pod \"3213ec5a-de3f-4686-9363-785046c860f3\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.212665 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3213ec5a-de3f-4686-9363-785046c860f3-etc-swift\") pod \"3213ec5a-de3f-4686-9363-785046c860f3\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.212689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-scripts\") pod \"3213ec5a-de3f-4686-9363-785046c860f3\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.212823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-dispersionconf\") pod \"3213ec5a-de3f-4686-9363-785046c860f3\" (UID: \"3213ec5a-de3f-4686-9363-785046c860f3\") " Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.213299 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3213ec5a-de3f-4686-9363-785046c860f3" (UID: "3213ec5a-de3f-4686-9363-785046c860f3"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.213323 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3213ec5a-de3f-4686-9363-785046c860f3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3213ec5a-de3f-4686-9363-785046c860f3" (UID: "3213ec5a-de3f-4686-9363-785046c860f3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.218637 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3213ec5a-de3f-4686-9363-785046c860f3-kube-api-access-wk8n8" (OuterVolumeSpecName: "kube-api-access-wk8n8") pod "3213ec5a-de3f-4686-9363-785046c860f3" (UID: "3213ec5a-de3f-4686-9363-785046c860f3"). InnerVolumeSpecName "kube-api-access-wk8n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.234121 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-scripts" (OuterVolumeSpecName: "scripts") pod "3213ec5a-de3f-4686-9363-785046c860f3" (UID: "3213ec5a-de3f-4686-9363-785046c860f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.239737 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3213ec5a-de3f-4686-9363-785046c860f3" (UID: "3213ec5a-de3f-4686-9363-785046c860f3"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.243234 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3213ec5a-de3f-4686-9363-785046c860f3" (UID: "3213ec5a-de3f-4686-9363-785046c860f3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.314752 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.314784 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.314797 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk8n8\" (UniqueName: \"kubernetes.io/projected/3213ec5a-de3f-4686-9363-785046c860f3-kube-api-access-wk8n8\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.314806 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3213ec5a-de3f-4686-9363-785046c860f3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.314814 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3213ec5a-de3f-4686-9363-785046c860f3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.314823 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/3213ec5a-de3f-4686-9363-785046c860f3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.738388 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8fbe4a2fa432c06bc9b84763733b57575356fb2a9b44d09b631e131129ed67" Mar 09 16:32:54 crc kubenswrapper[4831]: I0309 16:32:54.738783 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wm9fm" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.223835 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7"] Mar 09 16:32:55 crc kubenswrapper[4831]: E0309 16:32:55.224786 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3213ec5a-de3f-4686-9363-785046c860f3" containerName="swift-ring-rebalance" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.224804 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3213ec5a-de3f-4686-9363-785046c860f3" containerName="swift-ring-rebalance" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.234877 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3213ec5a-de3f-4686-9363-785046c860f3" containerName="swift-ring-rebalance" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.236228 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.243183 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.243357 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.264813 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7"] Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.438740 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-dispersionconf\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.438790 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmsm5\" (UniqueName: \"kubernetes.io/projected/d2b87073-f21b-4330-b9f8-711d553b5a88-kube-api-access-hmsm5\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.438814 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d2b87073-f21b-4330-b9f8-711d553b5a88-etc-swift\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.438851 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-scripts\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.438976 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-swiftconf\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.439027 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-ring-data-devices\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.540323 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-dispersionconf\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.540387 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmsm5\" (UniqueName: \"kubernetes.io/projected/d2b87073-f21b-4330-b9f8-711d553b5a88-kube-api-access-hmsm5\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc 
kubenswrapper[4831]: I0309 16:32:55.540451 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d2b87073-f21b-4330-b9f8-711d553b5a88-etc-swift\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.540491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-scripts\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.540571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-swiftconf\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.540642 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-ring-data-devices\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.541755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d2b87073-f21b-4330-b9f8-711d553b5a88-etc-swift\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: 
I0309 16:32:55.542244 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-scripts\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.542422 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-ring-data-devices\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.545002 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-swiftconf\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.551094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-dispersionconf\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.563678 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmsm5\" (UniqueName: \"kubernetes.io/projected/d2b87073-f21b-4330-b9f8-711d553b5a88-kube-api-access-hmsm5\") pod \"swift-ring-rebalance-debug-ndqz7\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.569496 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:55 crc kubenswrapper[4831]: I0309 16:32:55.637162 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3213ec5a-de3f-4686-9363-785046c860f3" path="/var/lib/kubelet/pods/3213ec5a-de3f-4686-9363-785046c860f3/volumes" Mar 09 16:32:56 crc kubenswrapper[4831]: I0309 16:32:56.053746 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7"] Mar 09 16:32:56 crc kubenswrapper[4831]: W0309 16:32:56.060501 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b87073_f21b_4330_b9f8_711d553b5a88.slice/crio-a1e9494cb425800c10b0c00165916b6799842fd4d770c8e56c19fc17c6ecacea WatchSource:0}: Error finding container a1e9494cb425800c10b0c00165916b6799842fd4d770c8e56c19fc17c6ecacea: Status 404 returned error can't find the container with id a1e9494cb425800c10b0c00165916b6799842fd4d770c8e56c19fc17c6ecacea Mar 09 16:32:56 crc kubenswrapper[4831]: I0309 16:32:56.755823 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" event={"ID":"d2b87073-f21b-4330-b9f8-711d553b5a88","Type":"ContainerStarted","Data":"aa244e2bce21147eb28383e353fe8006d0bca2a6a4a8a71cddee2e1cf68388e2"} Mar 09 16:32:56 crc kubenswrapper[4831]: I0309 16:32:56.756143 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" event={"ID":"d2b87073-f21b-4330-b9f8-711d553b5a88","Type":"ContainerStarted","Data":"a1e9494cb425800c10b0c00165916b6799842fd4d770c8e56c19fc17c6ecacea"} Mar 09 16:32:56 crc kubenswrapper[4831]: I0309 16:32:56.777331 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" podStartSLOduration=1.777316882 
podStartE2EDuration="1.777316882s" podCreationTimestamp="2026-03-09 16:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:32:56.775518711 +0000 UTC m=+2103.909201144" watchObservedRunningTime="2026-03-09 16:32:56.777316882 +0000 UTC m=+2103.910999305" Mar 09 16:32:57 crc kubenswrapper[4831]: E0309 16:32:57.587048 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b87073_f21b_4330_b9f8_711d553b5a88.slice/crio-conmon-aa244e2bce21147eb28383e353fe8006d0bca2a6a4a8a71cddee2e1cf68388e2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b87073_f21b_4330_b9f8_711d553b5a88.slice/crio-aa244e2bce21147eb28383e353fe8006d0bca2a6a4a8a71cddee2e1cf68388e2.scope\": RecentStats: unable to find data in memory cache]" Mar 09 16:32:57 crc kubenswrapper[4831]: I0309 16:32:57.765094 4831 generic.go:334] "Generic (PLEG): container finished" podID="d2b87073-f21b-4330-b9f8-711d553b5a88" containerID="aa244e2bce21147eb28383e353fe8006d0bca2a6a4a8a71cddee2e1cf68388e2" exitCode=0 Mar 09 16:32:57 crc kubenswrapper[4831]: I0309 16:32:57.765161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" event={"ID":"d2b87073-f21b-4330-b9f8-711d553b5a88","Type":"ContainerDied","Data":"aa244e2bce21147eb28383e353fe8006d0bca2a6a4a8a71cddee2e1cf68388e2"} Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.054366 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.103428 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7"] Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.112916 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7"] Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.196447 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d2b87073-f21b-4330-b9f8-711d553b5a88-etc-swift\") pod \"d2b87073-f21b-4330-b9f8-711d553b5a88\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.196532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-scripts\") pod \"d2b87073-f21b-4330-b9f8-711d553b5a88\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.196627 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-ring-data-devices\") pod \"d2b87073-f21b-4330-b9f8-711d553b5a88\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.196691 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-swiftconf\") pod \"d2b87073-f21b-4330-b9f8-711d553b5a88\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.196787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-dispersionconf\") pod \"d2b87073-f21b-4330-b9f8-711d553b5a88\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.196886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmsm5\" (UniqueName: \"kubernetes.io/projected/d2b87073-f21b-4330-b9f8-711d553b5a88-kube-api-access-hmsm5\") pod \"d2b87073-f21b-4330-b9f8-711d553b5a88\" (UID: \"d2b87073-f21b-4330-b9f8-711d553b5a88\") " Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.197970 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d2b87073-f21b-4330-b9f8-711d553b5a88" (UID: "d2b87073-f21b-4330-b9f8-711d553b5a88"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.198109 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b87073-f21b-4330-b9f8-711d553b5a88-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d2b87073-f21b-4330-b9f8-711d553b5a88" (UID: "d2b87073-f21b-4330-b9f8-711d553b5a88"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.202928 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b87073-f21b-4330-b9f8-711d553b5a88-kube-api-access-hmsm5" (OuterVolumeSpecName: "kube-api-access-hmsm5") pod "d2b87073-f21b-4330-b9f8-711d553b5a88" (UID: "d2b87073-f21b-4330-b9f8-711d553b5a88"). InnerVolumeSpecName "kube-api-access-hmsm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.221161 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-scripts" (OuterVolumeSpecName: "scripts") pod "d2b87073-f21b-4330-b9f8-711d553b5a88" (UID: "d2b87073-f21b-4330-b9f8-711d553b5a88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.223461 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d2b87073-f21b-4330-b9f8-711d553b5a88" (UID: "d2b87073-f21b-4330-b9f8-711d553b5a88"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.233677 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d2b87073-f21b-4330-b9f8-711d553b5a88" (UID: "d2b87073-f21b-4330-b9f8-711d553b5a88"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.298247 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmsm5\" (UniqueName: \"kubernetes.io/projected/d2b87073-f21b-4330-b9f8-711d553b5a88-kube-api-access-hmsm5\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.298281 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d2b87073-f21b-4330-b9f8-711d553b5a88-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.298293 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.298302 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d2b87073-f21b-4330-b9f8-711d553b5a88-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.298311 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.298319 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d2b87073-f21b-4330-b9f8-711d553b5a88-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.627100 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b87073-f21b-4330-b9f8-711d553b5a88" path="/var/lib/kubelet/pods/d2b87073-f21b-4330-b9f8-711d553b5a88/volumes" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.784745 4831 scope.go:117] "RemoveContainer" 
containerID="aa244e2bce21147eb28383e353fe8006d0bca2a6a4a8a71cddee2e1cf68388e2" Mar 09 16:32:59 crc kubenswrapper[4831]: I0309 16:32:59.784779 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndqz7" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.242769 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-spkl2"] Mar 09 16:33:00 crc kubenswrapper[4831]: E0309 16:33:00.243480 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b87073-f21b-4330-b9f8-711d553b5a88" containerName="swift-ring-rebalance" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.243497 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b87073-f21b-4330-b9f8-711d553b5a88" containerName="swift-ring-rebalance" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.243707 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b87073-f21b-4330-b9f8-711d553b5a88" containerName="swift-ring-rebalance" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.244253 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.246714 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.247140 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.255890 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-spkl2"] Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.316926 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-dispersionconf\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.316979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-scripts\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.317012 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-etc-swift\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.317054 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-swiftconf\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.317089 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjng\" (UniqueName: \"kubernetes.io/projected/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-kube-api-access-2jjng\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.317157 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.418555 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-dispersionconf\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.418603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-scripts\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.418634 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-etc-swift\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.418678 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-swiftconf\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.418714 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjng\" (UniqueName: \"kubernetes.io/projected/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-kube-api-access-2jjng\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.418775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.419499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-etc-swift\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.419886 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.421134 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-scripts\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.423545 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-swiftconf\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.424132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-dispersionconf\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.436752 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjng\" (UniqueName: \"kubernetes.io/projected/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-kube-api-access-2jjng\") pod \"swift-ring-rebalance-debug-spkl2\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:00 crc kubenswrapper[4831]: I0309 16:33:00.581955 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:01 crc kubenswrapper[4831]: I0309 16:33:01.079697 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-spkl2"] Mar 09 16:33:01 crc kubenswrapper[4831]: W0309 16:33:01.086451 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe4a5615_0f46_4d8c_90dc_eb33c3859d3f.slice/crio-02704a47132b9cabd7645767b75ac15bcd17a753c849e36c5704f31e49104306 WatchSource:0}: Error finding container 02704a47132b9cabd7645767b75ac15bcd17a753c849e36c5704f31e49104306: Status 404 returned error can't find the container with id 02704a47132b9cabd7645767b75ac15bcd17a753c849e36c5704f31e49104306 Mar 09 16:33:01 crc kubenswrapper[4831]: I0309 16:33:01.809944 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" event={"ID":"be4a5615-0f46-4d8c-90dc-eb33c3859d3f","Type":"ContainerStarted","Data":"f0e58a0ad94be442ea8ea9f6825dc25881be3e38ea4b5f3df96108f53d7cae92"} Mar 09 16:33:01 crc kubenswrapper[4831]: I0309 16:33:01.810260 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" event={"ID":"be4a5615-0f46-4d8c-90dc-eb33c3859d3f","Type":"ContainerStarted","Data":"02704a47132b9cabd7645767b75ac15bcd17a753c849e36c5704f31e49104306"} Mar 09 16:33:01 crc kubenswrapper[4831]: I0309 16:33:01.826157 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" podStartSLOduration=1.826135223 podStartE2EDuration="1.826135223s" podCreationTimestamp="2026-03-09 16:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:01.825232327 +0000 UTC m=+2108.958914750" 
watchObservedRunningTime="2026-03-09 16:33:01.826135223 +0000 UTC m=+2108.959817656" Mar 09 16:33:02 crc kubenswrapper[4831]: I0309 16:33:02.537799 4831 scope.go:117] "RemoveContainer" containerID="31ed55511a0905a9add38b6d1dae8c24946886d3738605f0712a9cc3ebe88431" Mar 09 16:33:02 crc kubenswrapper[4831]: I0309 16:33:02.828103 4831 generic.go:334] "Generic (PLEG): container finished" podID="be4a5615-0f46-4d8c-90dc-eb33c3859d3f" containerID="f0e58a0ad94be442ea8ea9f6825dc25881be3e38ea4b5f3df96108f53d7cae92" exitCode=0 Mar 09 16:33:02 crc kubenswrapper[4831]: I0309 16:33:02.828193 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" event={"ID":"be4a5615-0f46-4d8c-90dc-eb33c3859d3f","Type":"ContainerDied","Data":"f0e58a0ad94be442ea8ea9f6825dc25881be3e38ea4b5f3df96108f53d7cae92"} Mar 09 16:33:03 crc kubenswrapper[4831]: I0309 16:33:03.018521 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:33:03 crc kubenswrapper[4831]: I0309 16:33:03.018575 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.072416 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.083601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-ring-data-devices\") pod \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.083639 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-scripts\") pod \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.083677 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-dispersionconf\") pod \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.083692 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-swiftconf\") pod \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.083720 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-etc-swift\") pod \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.083736 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jjng\" 
(UniqueName: \"kubernetes.io/projected/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-kube-api-access-2jjng\") pod \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\" (UID: \"be4a5615-0f46-4d8c-90dc-eb33c3859d3f\") " Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.085892 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "be4a5615-0f46-4d8c-90dc-eb33c3859d3f" (UID: "be4a5615-0f46-4d8c-90dc-eb33c3859d3f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.086166 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "be4a5615-0f46-4d8c-90dc-eb33c3859d3f" (UID: "be4a5615-0f46-4d8c-90dc-eb33c3859d3f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.102156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-kube-api-access-2jjng" (OuterVolumeSpecName: "kube-api-access-2jjng") pod "be4a5615-0f46-4d8c-90dc-eb33c3859d3f" (UID: "be4a5615-0f46-4d8c-90dc-eb33c3859d3f"). InnerVolumeSpecName "kube-api-access-2jjng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.108330 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-spkl2"] Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.111745 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-scripts" (OuterVolumeSpecName: "scripts") pod "be4a5615-0f46-4d8c-90dc-eb33c3859d3f" (UID: "be4a5615-0f46-4d8c-90dc-eb33c3859d3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.114286 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "be4a5615-0f46-4d8c-90dc-eb33c3859d3f" (UID: "be4a5615-0f46-4d8c-90dc-eb33c3859d3f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.115937 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-spkl2"] Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.129035 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "be4a5615-0f46-4d8c-90dc-eb33c3859d3f" (UID: "be4a5615-0f46-4d8c-90dc-eb33c3859d3f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.184887 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.184920 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.184929 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.184937 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.184945 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.184954 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jjng\" (UniqueName: \"kubernetes.io/projected/be4a5615-0f46-4d8c-90dc-eb33c3859d3f-kube-api-access-2jjng\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.845939 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02704a47132b9cabd7645767b75ac15bcd17a753c849e36c5704f31e49104306" Mar 09 16:33:04 crc kubenswrapper[4831]: I0309 16:33:04.846023 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-spkl2" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.241234 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn"] Mar 09 16:33:05 crc kubenswrapper[4831]: E0309 16:33:05.241546 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4a5615-0f46-4d8c-90dc-eb33c3859d3f" containerName="swift-ring-rebalance" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.241559 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4a5615-0f46-4d8c-90dc-eb33c3859d3f" containerName="swift-ring-rebalance" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.241711 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4a5615-0f46-4d8c-90dc-eb33c3859d3f" containerName="swift-ring-rebalance" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.242147 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.244748 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.245086 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.254275 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn"] Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.402151 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-scripts\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.402223 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03faf89-705e-4816-a840-12a3b6a941ff-etc-swift\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.402263 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.402369 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-dispersionconf\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.402419 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk58v\" (UniqueName: \"kubernetes.io/projected/e03faf89-705e-4816-a840-12a3b6a941ff-kube-api-access-tk58v\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.402469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-swiftconf\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.503252 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03faf89-705e-4816-a840-12a3b6a941ff-etc-swift\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.503305 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.503351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-dispersionconf\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.503375 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk58v\" (UniqueName: \"kubernetes.io/projected/e03faf89-705e-4816-a840-12a3b6a941ff-kube-api-access-tk58v\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.503429 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-swiftconf\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.503480 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-scripts\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.503701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03faf89-705e-4816-a840-12a3b6a941ff-etc-swift\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.504114 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-scripts\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.504259 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.507491 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-swiftconf\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.514920 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-dispersionconf\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.520422 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk58v\" (UniqueName: \"kubernetes.io/projected/e03faf89-705e-4816-a840-12a3b6a941ff-kube-api-access-tk58v\") pod \"swift-ring-rebalance-debug-zrhvn\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.566002 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.626557 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4a5615-0f46-4d8c-90dc-eb33c3859d3f" path="/var/lib/kubelet/pods/be4a5615-0f46-4d8c-90dc-eb33c3859d3f/volumes" Mar 09 16:33:05 crc kubenswrapper[4831]: I0309 16:33:05.960693 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn"] Mar 09 16:33:05 crc kubenswrapper[4831]: W0309 16:33:05.962170 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode03faf89_705e_4816_a840_12a3b6a941ff.slice/crio-898ab972181bad07f663d0ed8b9edbbcdee7728908be6a0fe596f0f6afa263bb WatchSource:0}: Error finding container 898ab972181bad07f663d0ed8b9edbbcdee7728908be6a0fe596f0f6afa263bb: Status 404 returned error can't find the container with id 898ab972181bad07f663d0ed8b9edbbcdee7728908be6a0fe596f0f6afa263bb Mar 09 16:33:06 crc kubenswrapper[4831]: I0309 16:33:06.860313 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" event={"ID":"e03faf89-705e-4816-a840-12a3b6a941ff","Type":"ContainerStarted","Data":"ecf84e5fa39b0052ca4dfa76931bbca51b345845f385302ad3876d537d793f75"} Mar 09 16:33:06 crc kubenswrapper[4831]: I0309 16:33:06.860642 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" event={"ID":"e03faf89-705e-4816-a840-12a3b6a941ff","Type":"ContainerStarted","Data":"898ab972181bad07f663d0ed8b9edbbcdee7728908be6a0fe596f0f6afa263bb"} Mar 09 16:33:06 crc kubenswrapper[4831]: I0309 16:33:06.880921 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" podStartSLOduration=1.880898703 podStartE2EDuration="1.880898703s" podCreationTimestamp="2026-03-09 
16:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:06.875697125 +0000 UTC m=+2114.009379568" watchObservedRunningTime="2026-03-09 16:33:06.880898703 +0000 UTC m=+2114.014581136" Mar 09 16:33:07 crc kubenswrapper[4831]: I0309 16:33:07.869474 4831 generic.go:334] "Generic (PLEG): container finished" podID="e03faf89-705e-4816-a840-12a3b6a941ff" containerID="ecf84e5fa39b0052ca4dfa76931bbca51b345845f385302ad3876d537d793f75" exitCode=0 Mar 09 16:33:07 crc kubenswrapper[4831]: I0309 16:33:07.869539 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" event={"ID":"e03faf89-705e-4816-a840-12a3b6a941ff","Type":"ContainerDied","Data":"ecf84e5fa39b0052ca4dfa76931bbca51b345845f385302ad3876d537d793f75"} Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.162839 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.164245 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03faf89-705e-4816-a840-12a3b6a941ff-etc-swift\") pod \"e03faf89-705e-4816-a840-12a3b6a941ff\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.164302 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-dispersionconf\") pod \"e03faf89-705e-4816-a840-12a3b6a941ff\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.164345 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-ring-data-devices\") pod \"e03faf89-705e-4816-a840-12a3b6a941ff\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.164416 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk58v\" (UniqueName: \"kubernetes.io/projected/e03faf89-705e-4816-a840-12a3b6a941ff-kube-api-access-tk58v\") pod \"e03faf89-705e-4816-a840-12a3b6a941ff\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.164439 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-swiftconf\") pod \"e03faf89-705e-4816-a840-12a3b6a941ff\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.164496 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-scripts\") pod \"e03faf89-705e-4816-a840-12a3b6a941ff\" (UID: \"e03faf89-705e-4816-a840-12a3b6a941ff\") " Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.165381 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e03faf89-705e-4816-a840-12a3b6a941ff" (UID: "e03faf89-705e-4816-a840-12a3b6a941ff"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.165625 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03faf89-705e-4816-a840-12a3b6a941ff-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e03faf89-705e-4816-a840-12a3b6a941ff" (UID: "e03faf89-705e-4816-a840-12a3b6a941ff"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.172616 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03faf89-705e-4816-a840-12a3b6a941ff-kube-api-access-tk58v" (OuterVolumeSpecName: "kube-api-access-tk58v") pod "e03faf89-705e-4816-a840-12a3b6a941ff" (UID: "e03faf89-705e-4816-a840-12a3b6a941ff"). InnerVolumeSpecName "kube-api-access-tk58v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.187369 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-scripts" (OuterVolumeSpecName: "scripts") pod "e03faf89-705e-4816-a840-12a3b6a941ff" (UID: "e03faf89-705e-4816-a840-12a3b6a941ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.206382 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e03faf89-705e-4816-a840-12a3b6a941ff" (UID: "e03faf89-705e-4816-a840-12a3b6a941ff"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.211602 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn"] Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.213445 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e03faf89-705e-4816-a840-12a3b6a941ff" (UID: "e03faf89-705e-4816-a840-12a3b6a941ff"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.223824 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn"] Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.265663 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.265699 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk58v\" (UniqueName: \"kubernetes.io/projected/e03faf89-705e-4816-a840-12a3b6a941ff-kube-api-access-tk58v\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.265710 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.265718 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03faf89-705e-4816-a840-12a3b6a941ff-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.265727 4831 reconciler_common.go:293] 
"Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03faf89-705e-4816-a840-12a3b6a941ff-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.265735 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03faf89-705e-4816-a840-12a3b6a941ff-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.627033 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03faf89-705e-4816-a840-12a3b6a941ff" path="/var/lib/kubelet/pods/e03faf89-705e-4816-a840-12a3b6a941ff/volumes" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.888391 4831 scope.go:117] "RemoveContainer" containerID="ecf84e5fa39b0052ca4dfa76931bbca51b345845f385302ad3876d537d793f75" Mar 09 16:33:09 crc kubenswrapper[4831]: I0309 16:33:09.888481 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrhvn" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.343430 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q6twf"] Mar 09 16:33:10 crc kubenswrapper[4831]: E0309 16:33:10.343720 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03faf89-705e-4816-a840-12a3b6a941ff" containerName="swift-ring-rebalance" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.343731 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03faf89-705e-4816-a840-12a3b6a941ff" containerName="swift-ring-rebalance" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.343875 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03faf89-705e-4816-a840-12a3b6a941ff" containerName="swift-ring-rebalance" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.344343 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.349642 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.349823 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.365705 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q6twf"] Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.483077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-swiftconf\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.483129 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7623cf5-ffb6-414d-824e-3551a6106940-etc-swift\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.483187 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-dispersionconf\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.483223 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-scripts\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.483264 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl7qv\" (UniqueName: \"kubernetes.io/projected/b7623cf5-ffb6-414d-824e-3551a6106940-kube-api-access-bl7qv\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.483406 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-ring-data-devices\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.584352 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-scripts\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.584453 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl7qv\" (UniqueName: \"kubernetes.io/projected/b7623cf5-ffb6-414d-824e-3551a6106940-kube-api-access-bl7qv\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 
16:33:10.584551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-ring-data-devices\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.584602 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-swiftconf\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.584625 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7623cf5-ffb6-414d-824e-3551a6106940-etc-swift\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.584655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-dispersionconf\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.585921 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-scripts\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.586014 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7623cf5-ffb6-414d-824e-3551a6106940-etc-swift\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.586306 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-ring-data-devices\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.590004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-dispersionconf\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.597917 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-swiftconf\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.608800 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl7qv\" (UniqueName: \"kubernetes.io/projected/b7623cf5-ffb6-414d-824e-3551a6106940-kube-api-access-bl7qv\") pod \"swift-ring-rebalance-debug-q6twf\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:10 crc kubenswrapper[4831]: I0309 16:33:10.665917 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:11 crc kubenswrapper[4831]: I0309 16:33:11.099621 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q6twf"] Mar 09 16:33:11 crc kubenswrapper[4831]: I0309 16:33:11.909017 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" event={"ID":"b7623cf5-ffb6-414d-824e-3551a6106940","Type":"ContainerStarted","Data":"b555c2a4360a825c5752c2c1a35cea62cd1a5cbb379bdbd2b9d65525ceca189b"} Mar 09 16:33:11 crc kubenswrapper[4831]: I0309 16:33:11.909541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" event={"ID":"b7623cf5-ffb6-414d-824e-3551a6106940","Type":"ContainerStarted","Data":"251bc341f2e8717a9a90345b7558449d9104937fcc0d986406b82e4f5cb28ff6"} Mar 09 16:33:11 crc kubenswrapper[4831]: I0309 16:33:11.934555 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" podStartSLOduration=1.934537391 podStartE2EDuration="1.934537391s" podCreationTimestamp="2026-03-09 16:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:11.924785604 +0000 UTC m=+2119.058468047" watchObservedRunningTime="2026-03-09 16:33:11.934537391 +0000 UTC m=+2119.068219814" Mar 09 16:33:12 crc kubenswrapper[4831]: I0309 16:33:12.918841 4831 generic.go:334] "Generic (PLEG): container finished" podID="b7623cf5-ffb6-414d-824e-3551a6106940" containerID="b555c2a4360a825c5752c2c1a35cea62cd1a5cbb379bdbd2b9d65525ceca189b" exitCode=0 Mar 09 16:33:12 crc kubenswrapper[4831]: I0309 16:33:12.918897 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" 
event={"ID":"b7623cf5-ffb6-414d-824e-3551a6106940","Type":"ContainerDied","Data":"b555c2a4360a825c5752c2c1a35cea62cd1a5cbb379bdbd2b9d65525ceca189b"} Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.206769 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.235767 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q6twf"] Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.241621 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q6twf"] Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.244248 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-dispersionconf\") pod \"b7623cf5-ffb6-414d-824e-3551a6106940\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.244347 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-swiftconf\") pod \"b7623cf5-ffb6-414d-824e-3551a6106940\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.244432 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7623cf5-ffb6-414d-824e-3551a6106940-etc-swift\") pod \"b7623cf5-ffb6-414d-824e-3551a6106940\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.244463 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl7qv\" (UniqueName: 
\"kubernetes.io/projected/b7623cf5-ffb6-414d-824e-3551a6106940-kube-api-access-bl7qv\") pod \"b7623cf5-ffb6-414d-824e-3551a6106940\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.244521 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-scripts\") pod \"b7623cf5-ffb6-414d-824e-3551a6106940\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.244549 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-ring-data-devices\") pod \"b7623cf5-ffb6-414d-824e-3551a6106940\" (UID: \"b7623cf5-ffb6-414d-824e-3551a6106940\") " Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.245123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b7623cf5-ffb6-414d-824e-3551a6106940" (UID: "b7623cf5-ffb6-414d-824e-3551a6106940"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.245319 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.245325 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7623cf5-ffb6-414d-824e-3551a6106940-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7623cf5-ffb6-414d-824e-3551a6106940" (UID: "b7623cf5-ffb6-414d-824e-3551a6106940"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.249159 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7623cf5-ffb6-414d-824e-3551a6106940-kube-api-access-bl7qv" (OuterVolumeSpecName: "kube-api-access-bl7qv") pod "b7623cf5-ffb6-414d-824e-3551a6106940" (UID: "b7623cf5-ffb6-414d-824e-3551a6106940"). InnerVolumeSpecName "kube-api-access-bl7qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.265437 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-scripts" (OuterVolumeSpecName: "scripts") pod "b7623cf5-ffb6-414d-824e-3551a6106940" (UID: "b7623cf5-ffb6-414d-824e-3551a6106940"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.266415 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b7623cf5-ffb6-414d-824e-3551a6106940" (UID: "b7623cf5-ffb6-414d-824e-3551a6106940"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.270376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b7623cf5-ffb6-414d-824e-3551a6106940" (UID: "b7623cf5-ffb6-414d-824e-3551a6106940"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.346458 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.346497 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7623cf5-ffb6-414d-824e-3551a6106940-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.346518 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7623cf5-ffb6-414d-824e-3551a6106940-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.346528 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl7qv\" (UniqueName: \"kubernetes.io/projected/b7623cf5-ffb6-414d-824e-3551a6106940-kube-api-access-bl7qv\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.346538 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7623cf5-ffb6-414d-824e-3551a6106940-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.936418 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="251bc341f2e8717a9a90345b7558449d9104937fcc0d986406b82e4f5cb28ff6" Mar 09 16:33:14 crc kubenswrapper[4831]: I0309 16:33:14.936520 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q6twf" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.367936 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8xx94"] Mar 09 16:33:15 crc kubenswrapper[4831]: E0309 16:33:15.368279 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7623cf5-ffb6-414d-824e-3551a6106940" containerName="swift-ring-rebalance" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.368294 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7623cf5-ffb6-414d-824e-3551a6106940" containerName="swift-ring-rebalance" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.368517 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7623cf5-ffb6-414d-824e-3551a6106940" containerName="swift-ring-rebalance" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.369084 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.371685 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.371732 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.379666 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8xx94"] Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.460709 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-dispersionconf\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.460773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72698d45-dc8b-469c-b938-836c7f1160b5-etc-swift\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.460804 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-swiftconf\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.460893 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/72698d45-dc8b-469c-b938-836c7f1160b5-kube-api-access-z2shv\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.460978 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-scripts\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.461066 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-ring-data-devices\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.562678 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-ring-data-devices\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.562834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-dispersionconf\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.562878 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72698d45-dc8b-469c-b938-836c7f1160b5-etc-swift\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.562905 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-swiftconf\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.562933 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/72698d45-dc8b-469c-b938-836c7f1160b5-kube-api-access-z2shv\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.562973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-scripts\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.563754 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72698d45-dc8b-469c-b938-836c7f1160b5-etc-swift\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.564032 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-ring-data-devices\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.564199 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-scripts\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.567656 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-dispersionconf\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.567828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-swiftconf\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.580060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/72698d45-dc8b-469c-b938-836c7f1160b5-kube-api-access-z2shv\") pod \"swift-ring-rebalance-debug-8xx94\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.628108 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7623cf5-ffb6-414d-824e-3551a6106940" path="/var/lib/kubelet/pods/b7623cf5-ffb6-414d-824e-3551a6106940/volumes" Mar 09 16:33:15 crc kubenswrapper[4831]: I0309 16:33:15.684329 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:16 crc kubenswrapper[4831]: I0309 16:33:16.132351 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8xx94"] Mar 09 16:33:16 crc kubenswrapper[4831]: I0309 16:33:16.951972 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" event={"ID":"72698d45-dc8b-469c-b938-836c7f1160b5","Type":"ContainerStarted","Data":"610f5016b9464ac517cd12d185e79588a60a95a85cac5154c3c00f735f4391a8"} Mar 09 16:33:16 crc kubenswrapper[4831]: I0309 16:33:16.952302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" event={"ID":"72698d45-dc8b-469c-b938-836c7f1160b5","Type":"ContainerStarted","Data":"802a6b3972d9e188b80c3efcde0a2ecbbe2452aea268107149033b1e197501d1"} Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.540377 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" podStartSLOduration=2.540361556 podStartE2EDuration="2.540361556s" podCreationTimestamp="2026-03-09 16:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:16.972211715 +0000 UTC m=+2124.105894138" watchObservedRunningTime="2026-03-09 16:33:17.540361556 +0000 UTC m=+2124.674043979" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.545579 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bkqgs"] Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.547483 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.570348 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkqgs"] Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.630196 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49cd9\" (UniqueName: \"kubernetes.io/projected/c52359a2-687c-4a06-8703-79a115ef5828-kube-api-access-49cd9\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.630330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-catalog-content\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.630368 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-utilities\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.731783 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49cd9\" (UniqueName: \"kubernetes.io/projected/c52359a2-687c-4a06-8703-79a115ef5828-kube-api-access-49cd9\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.731841 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-catalog-content\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.731871 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-utilities\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.732679 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-catalog-content\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.733262 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-utilities\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.758837 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49cd9\" (UniqueName: \"kubernetes.io/projected/c52359a2-687c-4a06-8703-79a115ef5828-kube-api-access-49cd9\") pod \"redhat-marketplace-bkqgs\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.868998 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.971847 4831 generic.go:334] "Generic (PLEG): container finished" podID="72698d45-dc8b-469c-b938-836c7f1160b5" containerID="610f5016b9464ac517cd12d185e79588a60a95a85cac5154c3c00f735f4391a8" exitCode=0 Mar 09 16:33:17 crc kubenswrapper[4831]: I0309 16:33:17.971906 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" event={"ID":"72698d45-dc8b-469c-b938-836c7f1160b5","Type":"ContainerDied","Data":"610f5016b9464ac517cd12d185e79588a60a95a85cac5154c3c00f735f4391a8"} Mar 09 16:33:18 crc kubenswrapper[4831]: I0309 16:33:18.147212 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkqgs"] Mar 09 16:33:18 crc kubenswrapper[4831]: I0309 16:33:18.981270 4831 generic.go:334] "Generic (PLEG): container finished" podID="c52359a2-687c-4a06-8703-79a115ef5828" containerID="f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3" exitCode=0 Mar 09 16:33:18 crc kubenswrapper[4831]: I0309 16:33:18.981323 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkqgs" event={"ID":"c52359a2-687c-4a06-8703-79a115ef5828","Type":"ContainerDied","Data":"f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3"} Mar 09 16:33:18 crc kubenswrapper[4831]: I0309 16:33:18.981649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkqgs" event={"ID":"c52359a2-687c-4a06-8703-79a115ef5828","Type":"ContainerStarted","Data":"e10f0b61d582ac02f023578dfd9004cbafcaf58c730864bb02357fe66ce5ca16"} Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.265937 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.301256 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8xx94"] Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.302374 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8xx94"] Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.455912 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-scripts\") pod \"72698d45-dc8b-469c-b938-836c7f1160b5\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.456538 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-ring-data-devices\") pod \"72698d45-dc8b-469c-b938-836c7f1160b5\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.456578 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72698d45-dc8b-469c-b938-836c7f1160b5-etc-swift\") pod \"72698d45-dc8b-469c-b938-836c7f1160b5\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.456689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/72698d45-dc8b-469c-b938-836c7f1160b5-kube-api-access-z2shv\") pod \"72698d45-dc8b-469c-b938-836c7f1160b5\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.456744 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-dispersionconf\") pod \"72698d45-dc8b-469c-b938-836c7f1160b5\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.456850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-swiftconf\") pod \"72698d45-dc8b-469c-b938-836c7f1160b5\" (UID: \"72698d45-dc8b-469c-b938-836c7f1160b5\") " Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.457703 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "72698d45-dc8b-469c-b938-836c7f1160b5" (UID: "72698d45-dc8b-469c-b938-836c7f1160b5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.457773 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72698d45-dc8b-469c-b938-836c7f1160b5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "72698d45-dc8b-469c-b938-836c7f1160b5" (UID: "72698d45-dc8b-469c-b938-836c7f1160b5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.464880 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72698d45-dc8b-469c-b938-836c7f1160b5-kube-api-access-z2shv" (OuterVolumeSpecName: "kube-api-access-z2shv") pod "72698d45-dc8b-469c-b938-836c7f1160b5" (UID: "72698d45-dc8b-469c-b938-836c7f1160b5"). InnerVolumeSpecName "kube-api-access-z2shv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.482966 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-scripts" (OuterVolumeSpecName: "scripts") pod "72698d45-dc8b-469c-b938-836c7f1160b5" (UID: "72698d45-dc8b-469c-b938-836c7f1160b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.485596 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "72698d45-dc8b-469c-b938-836c7f1160b5" (UID: "72698d45-dc8b-469c-b938-836c7f1160b5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.485925 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "72698d45-dc8b-469c-b938-836c7f1160b5" (UID: "72698d45-dc8b-469c-b938-836c7f1160b5"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.558137 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.558189 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.558201 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72698d45-dc8b-469c-b938-836c7f1160b5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.558213 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72698d45-dc8b-469c-b938-836c7f1160b5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.558223 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/72698d45-dc8b-469c-b938-836c7f1160b5-kube-api-access-z2shv\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.558234 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72698d45-dc8b-469c-b938-836c7f1160b5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.627609 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72698d45-dc8b-469c-b938-836c7f1160b5" path="/var/lib/kubelet/pods/72698d45-dc8b-469c-b938-836c7f1160b5/volumes" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.991587 4831 scope.go:117] "RemoveContainer" 
containerID="610f5016b9464ac517cd12d185e79588a60a95a85cac5154c3c00f735f4391a8" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.992588 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8xx94" Mar 09 16:33:19 crc kubenswrapper[4831]: I0309 16:33:19.997946 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkqgs" event={"ID":"c52359a2-687c-4a06-8703-79a115ef5828","Type":"ContainerStarted","Data":"9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080"} Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.431872 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6"] Mar 09 16:33:20 crc kubenswrapper[4831]: E0309 16:33:20.432849 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72698d45-dc8b-469c-b938-836c7f1160b5" containerName="swift-ring-rebalance" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.432974 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72698d45-dc8b-469c-b938-836c7f1160b5" containerName="swift-ring-rebalance" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.433268 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="72698d45-dc8b-469c-b938-836c7f1160b5" containerName="swift-ring-rebalance" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.435677 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.438911 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.439708 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.445672 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6"] Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.577891 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-ring-data-devices\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.577996 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-scripts\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.578029 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-swiftconf\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.578056 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qbw\" (UniqueName: \"kubernetes.io/projected/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-kube-api-access-d5qbw\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.578103 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-etc-swift\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.578147 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-dispersionconf\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.679537 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-ring-data-devices\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.679607 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-scripts\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 
crc kubenswrapper[4831]: I0309 16:33:20.679639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-swiftconf\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.679668 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qbw\" (UniqueName: \"kubernetes.io/projected/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-kube-api-access-d5qbw\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.679723 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-etc-swift\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.679775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-dispersionconf\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.680486 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-etc-swift\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc 
kubenswrapper[4831]: I0309 16:33:20.680856 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-ring-data-devices\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.680981 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-scripts\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.686215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-dispersionconf\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.688599 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-swiftconf\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: I0309 16:33:20.703825 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qbw\" (UniqueName: \"kubernetes.io/projected/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-kube-api-access-d5qbw\") pod \"swift-ring-rebalance-debug-cz7j6\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:20 crc kubenswrapper[4831]: 
I0309 16:33:20.765825 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:21 crc kubenswrapper[4831]: I0309 16:33:21.009193 4831 generic.go:334] "Generic (PLEG): container finished" podID="c52359a2-687c-4a06-8703-79a115ef5828" containerID="9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080" exitCode=0 Mar 09 16:33:21 crc kubenswrapper[4831]: I0309 16:33:21.009237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkqgs" event={"ID":"c52359a2-687c-4a06-8703-79a115ef5828","Type":"ContainerDied","Data":"9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080"} Mar 09 16:33:21 crc kubenswrapper[4831]: I0309 16:33:21.223288 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6"] Mar 09 16:33:22 crc kubenswrapper[4831]: I0309 16:33:22.017148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkqgs" event={"ID":"c52359a2-687c-4a06-8703-79a115ef5828","Type":"ContainerStarted","Data":"365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1"} Mar 09 16:33:22 crc kubenswrapper[4831]: I0309 16:33:22.019851 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" event={"ID":"e4b5268d-ce0e-4747-87d3-b90f7674a8bd","Type":"ContainerStarted","Data":"050c2d236f0a93bfd000bba6b032d817dfcdef47ef4ab46c04a808b682ab8bde"} Mar 09 16:33:22 crc kubenswrapper[4831]: I0309 16:33:22.019895 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" event={"ID":"e4b5268d-ce0e-4747-87d3-b90f7674a8bd","Type":"ContainerStarted","Data":"4500e53105586fce82a3c3ab1dd57d7910ea89e0d4c6d65566a0649d6483ecb4"} Mar 09 16:33:22 crc kubenswrapper[4831]: I0309 16:33:22.036726 4831 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-marketplace-bkqgs" podStartSLOduration=2.444833306 podStartE2EDuration="5.036707882s" podCreationTimestamp="2026-03-09 16:33:17 +0000 UTC" firstStartedPulling="2026-03-09 16:33:18.982934152 +0000 UTC m=+2126.116616575" lastFinishedPulling="2026-03-09 16:33:21.574808728 +0000 UTC m=+2128.708491151" observedRunningTime="2026-03-09 16:33:22.031909825 +0000 UTC m=+2129.165592268" watchObservedRunningTime="2026-03-09 16:33:22.036707882 +0000 UTC m=+2129.170390305" Mar 09 16:33:22 crc kubenswrapper[4831]: I0309 16:33:22.054783 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" podStartSLOduration=2.054766025 podStartE2EDuration="2.054766025s" podCreationTimestamp="2026-03-09 16:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:22.049292009 +0000 UTC m=+2129.182974442" watchObservedRunningTime="2026-03-09 16:33:22.054766025 +0000 UTC m=+2129.188448448" Mar 09 16:33:23 crc kubenswrapper[4831]: I0309 16:33:23.029369 4831 generic.go:334] "Generic (PLEG): container finished" podID="e4b5268d-ce0e-4747-87d3-b90f7674a8bd" containerID="050c2d236f0a93bfd000bba6b032d817dfcdef47ef4ab46c04a808b682ab8bde" exitCode=0 Mar 09 16:33:23 crc kubenswrapper[4831]: I0309 16:33:23.029459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" event={"ID":"e4b5268d-ce0e-4747-87d3-b90f7674a8bd","Type":"ContainerDied","Data":"050c2d236f0a93bfd000bba6b032d817dfcdef47ef4ab46c04a808b682ab8bde"} Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.352821 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.386346 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6"] Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.416546 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6"] Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.555065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-swiftconf\") pod \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.555115 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-ring-data-devices\") pod \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.555162 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-scripts\") pod \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.555188 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-dispersionconf\") pod \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.555225 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5qbw\" 
(UniqueName: \"kubernetes.io/projected/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-kube-api-access-d5qbw\") pod \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.555263 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-etc-swift\") pod \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\" (UID: \"e4b5268d-ce0e-4747-87d3-b90f7674a8bd\") " Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.556069 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e4b5268d-ce0e-4747-87d3-b90f7674a8bd" (UID: "e4b5268d-ce0e-4747-87d3-b90f7674a8bd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.556420 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e4b5268d-ce0e-4747-87d3-b90f7674a8bd" (UID: "e4b5268d-ce0e-4747-87d3-b90f7674a8bd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.565395 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-kube-api-access-d5qbw" (OuterVolumeSpecName: "kube-api-access-d5qbw") pod "e4b5268d-ce0e-4747-87d3-b90f7674a8bd" (UID: "e4b5268d-ce0e-4747-87d3-b90f7674a8bd"). InnerVolumeSpecName "kube-api-access-d5qbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.575888 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-scripts" (OuterVolumeSpecName: "scripts") pod "e4b5268d-ce0e-4747-87d3-b90f7674a8bd" (UID: "e4b5268d-ce0e-4747-87d3-b90f7674a8bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.589472 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e4b5268d-ce0e-4747-87d3-b90f7674a8bd" (UID: "e4b5268d-ce0e-4747-87d3-b90f7674a8bd"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.597611 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e4b5268d-ce0e-4747-87d3-b90f7674a8bd" (UID: "e4b5268d-ce0e-4747-87d3-b90f7674a8bd"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.656521 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5qbw\" (UniqueName: \"kubernetes.io/projected/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-kube-api-access-d5qbw\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.656555 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.656565 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.656574 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.656583 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:24 crc kubenswrapper[4831]: I0309 16:33:24.656592 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4b5268d-ce0e-4747-87d3-b90f7674a8bd-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.044621 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4500e53105586fce82a3c3ab1dd57d7910ea89e0d4c6d65566a0649d6483ecb4" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.044725 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cz7j6" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.520380 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jvss9"] Mar 09 16:33:25 crc kubenswrapper[4831]: E0309 16:33:25.520733 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b5268d-ce0e-4747-87d3-b90f7674a8bd" containerName="swift-ring-rebalance" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.520745 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b5268d-ce0e-4747-87d3-b90f7674a8bd" containerName="swift-ring-rebalance" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.520902 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b5268d-ce0e-4747-87d3-b90f7674a8bd" containerName="swift-ring-rebalance" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.521470 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.523009 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.523321 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.534560 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jvss9"] Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.569509 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-swiftconf\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.569668 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a68b84f-b4d2-4e1c-927f-d18855a2d017-etc-swift\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.569718 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-scripts\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.569767 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-ring-data-devices\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.569857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-dispersionconf\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.569891 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq84r\" (UniqueName: 
\"kubernetes.io/projected/3a68b84f-b4d2-4e1c-927f-d18855a2d017-kube-api-access-kq84r\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.626310 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b5268d-ce0e-4747-87d3-b90f7674a8bd" path="/var/lib/kubelet/pods/e4b5268d-ce0e-4747-87d3-b90f7674a8bd/volumes" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.671658 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-ring-data-devices\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.671787 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-dispersionconf\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.671821 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq84r\" (UniqueName: \"kubernetes.io/projected/3a68b84f-b4d2-4e1c-927f-d18855a2d017-kube-api-access-kq84r\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.671893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-swiftconf\") pod 
\"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.672046 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a68b84f-b4d2-4e1c-927f-d18855a2d017-etc-swift\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.672076 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-scripts\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.672715 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-ring-data-devices\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.672753 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a68b84f-b4d2-4e1c-927f-d18855a2d017-etc-swift\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.672859 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-scripts\") pod \"swift-ring-rebalance-debug-jvss9\" 
(UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.681453 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-dispersionconf\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.683788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-swiftconf\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.692446 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq84r\" (UniqueName: \"kubernetes.io/projected/3a68b84f-b4d2-4e1c-927f-d18855a2d017-kube-api-access-kq84r\") pod \"swift-ring-rebalance-debug-jvss9\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:25 crc kubenswrapper[4831]: I0309 16:33:25.864079 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:26 crc kubenswrapper[4831]: I0309 16:33:26.282819 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jvss9"] Mar 09 16:33:27 crc kubenswrapper[4831]: I0309 16:33:27.062876 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" event={"ID":"3a68b84f-b4d2-4e1c-927f-d18855a2d017","Type":"ContainerStarted","Data":"2f43a8485f65f565159df7ed64772f5c784e73a2071a84d87fad085fbe7d4c0a"} Mar 09 16:33:27 crc kubenswrapper[4831]: I0309 16:33:27.063153 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" event={"ID":"3a68b84f-b4d2-4e1c-927f-d18855a2d017","Type":"ContainerStarted","Data":"61adaa5ab8c3407e5c29e60f15c40a0ecba207df4fc14bc8d54b8b9f13f25678"} Mar 09 16:33:27 crc kubenswrapper[4831]: I0309 16:33:27.080645 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" podStartSLOduration=2.080628144 podStartE2EDuration="2.080628144s" podCreationTimestamp="2026-03-09 16:33:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:27.079636645 +0000 UTC m=+2134.213319068" watchObservedRunningTime="2026-03-09 16:33:27.080628144 +0000 UTC m=+2134.214310577" Mar 09 16:33:27 crc kubenswrapper[4831]: I0309 16:33:27.869993 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:27 crc kubenswrapper[4831]: I0309 16:33:27.870057 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:27 crc kubenswrapper[4831]: I0309 16:33:27.919757 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:28 crc kubenswrapper[4831]: I0309 16:33:28.073237 4831 generic.go:334] "Generic (PLEG): container finished" podID="3a68b84f-b4d2-4e1c-927f-d18855a2d017" containerID="2f43a8485f65f565159df7ed64772f5c784e73a2071a84d87fad085fbe7d4c0a" exitCode=0 Mar 09 16:33:28 crc kubenswrapper[4831]: I0309 16:33:28.073335 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" event={"ID":"3a68b84f-b4d2-4e1c-927f-d18855a2d017","Type":"ContainerDied","Data":"2f43a8485f65f565159df7ed64772f5c784e73a2071a84d87fad085fbe7d4c0a"} Mar 09 16:33:28 crc kubenswrapper[4831]: I0309 16:33:28.129928 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:28 crc kubenswrapper[4831]: I0309 16:33:28.181930 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkqgs"] Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.488667 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.517581 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jvss9"] Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.523117 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jvss9"] Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.627230 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a68b84f-b4d2-4e1c-927f-d18855a2d017-etc-swift\") pod \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.627467 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-ring-data-devices\") pod \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.627575 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-scripts\") pod \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.627698 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-swiftconf\") pod \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.627720 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-dispersionconf\") pod \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.627746 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq84r\" (UniqueName: \"kubernetes.io/projected/3a68b84f-b4d2-4e1c-927f-d18855a2d017-kube-api-access-kq84r\") pod \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\" (UID: \"3a68b84f-b4d2-4e1c-927f-d18855a2d017\") " Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.627948 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3a68b84f-b4d2-4e1c-927f-d18855a2d017" (UID: "3a68b84f-b4d2-4e1c-927f-d18855a2d017"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.628199 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.628198 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a68b84f-b4d2-4e1c-927f-d18855a2d017-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3a68b84f-b4d2-4e1c-927f-d18855a2d017" (UID: "3a68b84f-b4d2-4e1c-927f-d18855a2d017"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.632717 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a68b84f-b4d2-4e1c-927f-d18855a2d017-kube-api-access-kq84r" (OuterVolumeSpecName: "kube-api-access-kq84r") pod "3a68b84f-b4d2-4e1c-927f-d18855a2d017" (UID: "3a68b84f-b4d2-4e1c-927f-d18855a2d017"). InnerVolumeSpecName "kube-api-access-kq84r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.646250 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-scripts" (OuterVolumeSpecName: "scripts") pod "3a68b84f-b4d2-4e1c-927f-d18855a2d017" (UID: "3a68b84f-b4d2-4e1c-927f-d18855a2d017"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.648274 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3a68b84f-b4d2-4e1c-927f-d18855a2d017" (UID: "3a68b84f-b4d2-4e1c-927f-d18855a2d017"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.648413 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3a68b84f-b4d2-4e1c-927f-d18855a2d017" (UID: "3a68b84f-b4d2-4e1c-927f-d18855a2d017"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.729474 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a68b84f-b4d2-4e1c-927f-d18855a2d017-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.729511 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.729522 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a68b84f-b4d2-4e1c-927f-d18855a2d017-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.729536 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq84r\" (UniqueName: \"kubernetes.io/projected/3a68b84f-b4d2-4e1c-927f-d18855a2d017-kube-api-access-kq84r\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:29 crc kubenswrapper[4831]: I0309 16:33:29.729546 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a68b84f-b4d2-4e1c-927f-d18855a2d017-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.094086 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jvss9" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.094157 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bkqgs" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="registry-server" containerID="cri-o://365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1" gracePeriod=2 Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.094102 4831 scope.go:117] "RemoveContainer" containerID="2f43a8485f65f565159df7ed64772f5c784e73a2071a84d87fad085fbe7d4c0a" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.506778 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.644434 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-utilities\") pod \"c52359a2-687c-4a06-8703-79a115ef5828\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.644824 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-catalog-content\") pod \"c52359a2-687c-4a06-8703-79a115ef5828\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.644874 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49cd9\" (UniqueName: \"kubernetes.io/projected/c52359a2-687c-4a06-8703-79a115ef5828-kube-api-access-49cd9\") pod \"c52359a2-687c-4a06-8703-79a115ef5828\" (UID: \"c52359a2-687c-4a06-8703-79a115ef5828\") " Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.645507 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-utilities" (OuterVolumeSpecName: "utilities") pod "c52359a2-687c-4a06-8703-79a115ef5828" (UID: "c52359a2-687c-4a06-8703-79a115ef5828"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.650141 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52359a2-687c-4a06-8703-79a115ef5828-kube-api-access-49cd9" (OuterVolumeSpecName: "kube-api-access-49cd9") pod "c52359a2-687c-4a06-8703-79a115ef5828" (UID: "c52359a2-687c-4a06-8703-79a115ef5828"). InnerVolumeSpecName "kube-api-access-49cd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.653151 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p"] Mar 09 16:33:30 crc kubenswrapper[4831]: E0309 16:33:30.653511 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="extract-content" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.653538 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="extract-content" Mar 09 16:33:30 crc kubenswrapper[4831]: E0309 16:33:30.653554 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="extract-utilities" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.653566 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="extract-utilities" Mar 09 16:33:30 crc kubenswrapper[4831]: E0309 16:33:30.653592 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a68b84f-b4d2-4e1c-927f-d18855a2d017" containerName="swift-ring-rebalance" Mar 09 16:33:30 crc 
kubenswrapper[4831]: I0309 16:33:30.653604 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a68b84f-b4d2-4e1c-927f-d18855a2d017" containerName="swift-ring-rebalance" Mar 09 16:33:30 crc kubenswrapper[4831]: E0309 16:33:30.653629 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="registry-server" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.653638 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="registry-server" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.653842 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a68b84f-b4d2-4e1c-927f-d18855a2d017" containerName="swift-ring-rebalance" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.653870 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52359a2-687c-4a06-8703-79a115ef5828" containerName="registry-server" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.654480 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.658032 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.658305 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.676606 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p"] Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.747037 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49cd9\" (UniqueName: \"kubernetes.io/projected/c52359a2-687c-4a06-8703-79a115ef5828-kube-api-access-49cd9\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.747075 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.753859 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c52359a2-687c-4a06-8703-79a115ef5828" (UID: "c52359a2-687c-4a06-8703-79a115ef5828"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.848185 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-etc-swift\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.848236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-swiftconf\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.848259 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-scripts\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.848288 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxs6\" (UniqueName: \"kubernetes.io/projected/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-kube-api-access-bxxs6\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.848321 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-dispersionconf\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.848532 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.848708 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52359a2-687c-4a06-8703-79a115ef5828-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.949552 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.949654 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-etc-swift\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.949675 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-swiftconf\") pod 
\"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.949697 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-scripts\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.949736 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxs6\" (UniqueName: \"kubernetes.io/projected/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-kube-api-access-bxxs6\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.949778 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-dispersionconf\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.950368 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-etc-swift\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.950885 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-scripts\") pod 
\"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.950983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.953375 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-swiftconf\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.953662 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-dispersionconf\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:30 crc kubenswrapper[4831]: I0309 16:33:30.966301 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxs6\" (UniqueName: \"kubernetes.io/projected/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-kube-api-access-bxxs6\") pod \"swift-ring-rebalance-debug-2mg8p\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.017881 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.108217 4831 generic.go:334] "Generic (PLEG): container finished" podID="c52359a2-687c-4a06-8703-79a115ef5828" containerID="365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1" exitCode=0 Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.108287 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkqgs" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.108309 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkqgs" event={"ID":"c52359a2-687c-4a06-8703-79a115ef5828","Type":"ContainerDied","Data":"365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1"} Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.108371 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkqgs" event={"ID":"c52359a2-687c-4a06-8703-79a115ef5828","Type":"ContainerDied","Data":"e10f0b61d582ac02f023578dfd9004cbafcaf58c730864bb02357fe66ce5ca16"} Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.108419 4831 scope.go:117] "RemoveContainer" containerID="365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.140360 4831 scope.go:117] "RemoveContainer" containerID="9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.149682 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkqgs"] Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.158145 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkqgs"] Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.171194 4831 scope.go:117] "RemoveContainer" 
containerID="f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.200316 4831 scope.go:117] "RemoveContainer" containerID="365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1" Mar 09 16:33:31 crc kubenswrapper[4831]: E0309 16:33:31.200819 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1\": container with ID starting with 365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1 not found: ID does not exist" containerID="365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.200852 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1"} err="failed to get container status \"365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1\": rpc error: code = NotFound desc = could not find container \"365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1\": container with ID starting with 365a31dd9bbf13e523e9f9dffc2785771b98cd093ebde24d33116ea687ab2ed1 not found: ID does not exist" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.200873 4831 scope.go:117] "RemoveContainer" containerID="9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080" Mar 09 16:33:31 crc kubenswrapper[4831]: E0309 16:33:31.201510 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080\": container with ID starting with 9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080 not found: ID does not exist" containerID="9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080" Mar 09 16:33:31 crc 
kubenswrapper[4831]: I0309 16:33:31.201532 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080"} err="failed to get container status \"9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080\": rpc error: code = NotFound desc = could not find container \"9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080\": container with ID starting with 9e9725a2ad53386ee1727aa81198fda74ea17c3becd875f337763a34bcbc7080 not found: ID does not exist" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.201546 4831 scope.go:117] "RemoveContainer" containerID="f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3" Mar 09 16:33:31 crc kubenswrapper[4831]: E0309 16:33:31.201737 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3\": container with ID starting with f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3 not found: ID does not exist" containerID="f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.201757 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3"} err="failed to get container status \"f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3\": rpc error: code = NotFound desc = could not find container \"f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3\": container with ID starting with f1f6143c5782317153cd6821b8ffef274c4b878244b7e8f92fff918ef46006d3 not found: ID does not exist" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.471143 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p"] Mar 09 
16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.627549 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a68b84f-b4d2-4e1c-927f-d18855a2d017" path="/var/lib/kubelet/pods/3a68b84f-b4d2-4e1c-927f-d18855a2d017/volumes" Mar 09 16:33:31 crc kubenswrapper[4831]: I0309 16:33:31.628691 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52359a2-687c-4a06-8703-79a115ef5828" path="/var/lib/kubelet/pods/c52359a2-687c-4a06-8703-79a115ef5828/volumes" Mar 09 16:33:32 crc kubenswrapper[4831]: I0309 16:33:32.132788 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" event={"ID":"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7","Type":"ContainerStarted","Data":"b378cbfeaecd1a93f2fd472dc8240e300d82616d13167ed7330edf8618c2dcd7"} Mar 09 16:33:32 crc kubenswrapper[4831]: I0309 16:33:32.133124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" event={"ID":"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7","Type":"ContainerStarted","Data":"59a2cad50c175147323bbde63073790b88c9cbf72ffecd17e274367f2acfb836"} Mar 09 16:33:32 crc kubenswrapper[4831]: I0309 16:33:32.160086 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" podStartSLOduration=2.160053874 podStartE2EDuration="2.160053874s" podCreationTimestamp="2026-03-09 16:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:32.153873039 +0000 UTC m=+2139.287555472" watchObservedRunningTime="2026-03-09 16:33:32.160053874 +0000 UTC m=+2139.293736297" Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.019252 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.019318 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.019362 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.020083 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"258b09aa0807bb4f3c0676f22914bbc545a76966e9075840cca57fa0980ae55e"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.020180 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://258b09aa0807bb4f3c0676f22914bbc545a76966e9075840cca57fa0980ae55e" gracePeriod=600 Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.148366 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="258b09aa0807bb4f3c0676f22914bbc545a76966e9075840cca57fa0980ae55e" exitCode=0 Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.148463 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"258b09aa0807bb4f3c0676f22914bbc545a76966e9075840cca57fa0980ae55e"} Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.148540 4831 scope.go:117] "RemoveContainer" containerID="001b6c7b344e8795d498191a7f81074b522134a12291c0a52df3c23bd23348d1" Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.152128 4831 generic.go:334] "Generic (PLEG): container finished" podID="2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" containerID="b378cbfeaecd1a93f2fd472dc8240e300d82616d13167ed7330edf8618c2dcd7" exitCode=0 Mar 09 16:33:33 crc kubenswrapper[4831]: I0309 16:33:33.152976 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" event={"ID":"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7","Type":"ContainerDied","Data":"b378cbfeaecd1a93f2fd472dc8240e300d82616d13167ed7330edf8618c2dcd7"} Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.161535 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5"} Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.466643 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.522032 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p"] Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.531230 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p"] Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.628149 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-dispersionconf\") pod \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.628260 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-etc-swift\") pod \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.628346 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-swiftconf\") pod \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.628473 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-ring-data-devices\") pod \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.628552 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bxxs6\" (UniqueName: \"kubernetes.io/projected/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-kube-api-access-bxxs6\") pod \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.628622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-scripts\") pod \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\" (UID: \"2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7\") " Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.629273 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" (UID: "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.630052 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" (UID: "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.636166 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-kube-api-access-bxxs6" (OuterVolumeSpecName: "kube-api-access-bxxs6") pod "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" (UID: "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7"). InnerVolumeSpecName "kube-api-access-bxxs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.650936 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-scripts" (OuterVolumeSpecName: "scripts") pod "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" (UID: "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.656214 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" (UID: "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.662494 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" (UID: "2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.731052 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.731101 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.731114 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.731124 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.731133 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxxs6\" (UniqueName: \"kubernetes.io/projected/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-kube-api-access-bxxs6\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:34 crc kubenswrapper[4831]: I0309 16:33:34.731150 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.172908 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59a2cad50c175147323bbde63073790b88c9cbf72ffecd17e274367f2acfb836" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.172929 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mg8p" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.629087 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" path="/var/lib/kubelet/pods/2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7/volumes" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.676628 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh"] Mar 09 16:33:35 crc kubenswrapper[4831]: E0309 16:33:35.676929 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" containerName="swift-ring-rebalance" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.676943 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" containerName="swift-ring-rebalance" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.677078 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b72b7a7-5cd7-4a1e-b8b7-8a2f4dd128e7" containerName="swift-ring-rebalance" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.677545 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.682937 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.682961 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.692811 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh"] Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.851206 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26fbb99f-e79c-4b4d-b789-d714c9e77474-etc-swift\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.851266 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-ring-data-devices\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.851320 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-dispersionconf\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.851357 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-swiftconf\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.851471 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-scripts\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.851584 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn7r4\" (UniqueName: \"kubernetes.io/projected/26fbb99f-e79c-4b4d-b789-d714c9e77474-kube-api-access-xn7r4\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.954000 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26fbb99f-e79c-4b4d-b789-d714c9e77474-etc-swift\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.954075 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-ring-data-devices\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc 
kubenswrapper[4831]: I0309 16:33:35.954108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-dispersionconf\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.954133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-swiftconf\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.954188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-scripts\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.954212 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn7r4\" (UniqueName: \"kubernetes.io/projected/26fbb99f-e79c-4b4d-b789-d714c9e77474-kube-api-access-xn7r4\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.954661 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26fbb99f-e79c-4b4d-b789-d714c9e77474-etc-swift\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc 
kubenswrapper[4831]: I0309 16:33:35.955579 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-scripts\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.955636 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-ring-data-devices\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.963990 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-swiftconf\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.964534 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-dispersionconf\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: I0309 16:33:35.975476 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn7r4\" (UniqueName: \"kubernetes.io/projected/26fbb99f-e79c-4b4d-b789-d714c9e77474-kube-api-access-xn7r4\") pod \"swift-ring-rebalance-debug-qjjvh\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:35 crc kubenswrapper[4831]: 
I0309 16:33:35.994042 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:36 crc kubenswrapper[4831]: I0309 16:33:36.420754 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh"] Mar 09 16:33:36 crc kubenswrapper[4831]: W0309 16:33:36.423812 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26fbb99f_e79c_4b4d_b789_d714c9e77474.slice/crio-cd9b96e1026912d42e314a6630d3493fe7f98458764dc0da12457ea40a1b5c76 WatchSource:0}: Error finding container cd9b96e1026912d42e314a6630d3493fe7f98458764dc0da12457ea40a1b5c76: Status 404 returned error can't find the container with id cd9b96e1026912d42e314a6630d3493fe7f98458764dc0da12457ea40a1b5c76 Mar 09 16:33:37 crc kubenswrapper[4831]: I0309 16:33:37.204814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" event={"ID":"26fbb99f-e79c-4b4d-b789-d714c9e77474","Type":"ContainerStarted","Data":"ecdcf28871355ac8aebbe1f4ddbf37977e4f12412b0c1d1bc72845916a11b51d"} Mar 09 16:33:37 crc kubenswrapper[4831]: I0309 16:33:37.205183 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" event={"ID":"26fbb99f-e79c-4b4d-b789-d714c9e77474","Type":"ContainerStarted","Data":"cd9b96e1026912d42e314a6630d3493fe7f98458764dc0da12457ea40a1b5c76"} Mar 09 16:33:37 crc kubenswrapper[4831]: I0309 16:33:37.231693 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" podStartSLOduration=2.231669523 podStartE2EDuration="2.231669523s" podCreationTimestamp="2026-03-09 16:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:37.224092858 +0000 
UTC m=+2144.357775281" watchObservedRunningTime="2026-03-09 16:33:37.231669523 +0000 UTC m=+2144.365351946" Mar 09 16:33:38 crc kubenswrapper[4831]: I0309 16:33:38.213908 4831 generic.go:334] "Generic (PLEG): container finished" podID="26fbb99f-e79c-4b4d-b789-d714c9e77474" containerID="ecdcf28871355ac8aebbe1f4ddbf37977e4f12412b0c1d1bc72845916a11b51d" exitCode=0 Mar 09 16:33:38 crc kubenswrapper[4831]: I0309 16:33:38.214021 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" event={"ID":"26fbb99f-e79c-4b4d-b789-d714c9e77474","Type":"ContainerDied","Data":"ecdcf28871355ac8aebbe1f4ddbf37977e4f12412b0c1d1bc72845916a11b51d"} Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.562356 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.604656 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh"] Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.611289 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh"] Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.712714 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-scripts\") pod \"26fbb99f-e79c-4b4d-b789-d714c9e77474\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.712768 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-dispersionconf\") pod \"26fbb99f-e79c-4b4d-b789-d714c9e77474\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.712829 
4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-swiftconf\") pod \"26fbb99f-e79c-4b4d-b789-d714c9e77474\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.712866 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn7r4\" (UniqueName: \"kubernetes.io/projected/26fbb99f-e79c-4b4d-b789-d714c9e77474-kube-api-access-xn7r4\") pod \"26fbb99f-e79c-4b4d-b789-d714c9e77474\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.712934 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26fbb99f-e79c-4b4d-b789-d714c9e77474-etc-swift\") pod \"26fbb99f-e79c-4b4d-b789-d714c9e77474\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.712978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-ring-data-devices\") pod \"26fbb99f-e79c-4b4d-b789-d714c9e77474\" (UID: \"26fbb99f-e79c-4b4d-b789-d714c9e77474\") " Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.713705 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "26fbb99f-e79c-4b4d-b789-d714c9e77474" (UID: "26fbb99f-e79c-4b4d-b789-d714c9e77474"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.713972 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26fbb99f-e79c-4b4d-b789-d714c9e77474-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "26fbb99f-e79c-4b4d-b789-d714c9e77474" (UID: "26fbb99f-e79c-4b4d-b789-d714c9e77474"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.718536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26fbb99f-e79c-4b4d-b789-d714c9e77474-kube-api-access-xn7r4" (OuterVolumeSpecName: "kube-api-access-xn7r4") pod "26fbb99f-e79c-4b4d-b789-d714c9e77474" (UID: "26fbb99f-e79c-4b4d-b789-d714c9e77474"). InnerVolumeSpecName "kube-api-access-xn7r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.738901 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-scripts" (OuterVolumeSpecName: "scripts") pod "26fbb99f-e79c-4b4d-b789-d714c9e77474" (UID: "26fbb99f-e79c-4b4d-b789-d714c9e77474"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.739019 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "26fbb99f-e79c-4b4d-b789-d714c9e77474" (UID: "26fbb99f-e79c-4b4d-b789-d714c9e77474"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.739581 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "26fbb99f-e79c-4b4d-b789-d714c9e77474" (UID: "26fbb99f-e79c-4b4d-b789-d714c9e77474"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.814513 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.814548 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.814560 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26fbb99f-e79c-4b4d-b789-d714c9e77474-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.814569 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn7r4\" (UniqueName: \"kubernetes.io/projected/26fbb99f-e79c-4b4d-b789-d714c9e77474-kube-api-access-xn7r4\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.814580 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26fbb99f-e79c-4b4d-b789-d714c9e77474-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:39 crc kubenswrapper[4831]: I0309 16:33:39.814589 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/26fbb99f-e79c-4b4d-b789-d714c9e77474-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.229638 4831 scope.go:117] "RemoveContainer" containerID="ecdcf28871355ac8aebbe1f4ddbf37977e4f12412b0c1d1bc72845916a11b51d" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.229701 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qjjvh" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.771766 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk"] Mar 09 16:33:40 crc kubenswrapper[4831]: E0309 16:33:40.772433 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fbb99f-e79c-4b4d-b789-d714c9e77474" containerName="swift-ring-rebalance" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.772449 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fbb99f-e79c-4b4d-b789-d714c9e77474" containerName="swift-ring-rebalance" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.772633 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="26fbb99f-e79c-4b4d-b789-d714c9e77474" containerName="swift-ring-rebalance" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.773238 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.778758 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.778771 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.784170 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk"] Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.929156 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-dispersionconf\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.929244 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-swiftconf\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.929367 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7fq\" (UniqueName: \"kubernetes.io/projected/17a64c14-169a-4bee-ad27-e3f9ea984921-kube-api-access-9c7fq\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.929467 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-scripts\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.929554 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a64c14-169a-4bee-ad27-e3f9ea984921-etc-swift\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:40 crc kubenswrapper[4831]: I0309 16:33:40.929610 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.030760 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-dispersionconf\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.030821 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-swiftconf\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: 
I0309 16:33:41.030879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7fq\" (UniqueName: \"kubernetes.io/projected/17a64c14-169a-4bee-ad27-e3f9ea984921-kube-api-access-9c7fq\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.030920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-scripts\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.030946 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a64c14-169a-4bee-ad27-e3f9ea984921-etc-swift\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.030970 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.031521 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a64c14-169a-4bee-ad27-e3f9ea984921-etc-swift\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc 
kubenswrapper[4831]: I0309 16:33:41.031786 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.032206 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-scripts\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.035186 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-dispersionconf\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.035356 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-swiftconf\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.048372 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7fq\" (UniqueName: \"kubernetes.io/projected/17a64c14-169a-4bee-ad27-e3f9ea984921-kube-api-access-9c7fq\") pod \"swift-ring-rebalance-debug-qtwlk\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: 
I0309 16:33:41.091616 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.580528 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk"] Mar 09 16:33:41 crc kubenswrapper[4831]: I0309 16:33:41.634716 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26fbb99f-e79c-4b4d-b789-d714c9e77474" path="/var/lib/kubelet/pods/26fbb99f-e79c-4b4d-b789-d714c9e77474/volumes" Mar 09 16:33:42 crc kubenswrapper[4831]: I0309 16:33:42.252368 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" event={"ID":"17a64c14-169a-4bee-ad27-e3f9ea984921","Type":"ContainerStarted","Data":"7c6ad37776926321a3ce461804570fde2c62a843b8f2efd6e2b64e17ea02e6ac"} Mar 09 16:33:42 crc kubenswrapper[4831]: I0309 16:33:42.252929 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" event={"ID":"17a64c14-169a-4bee-ad27-e3f9ea984921","Type":"ContainerStarted","Data":"4526add33f104ef525988577c2f0ce7abbd174528f7afa4b288f7ec4ae543191"} Mar 09 16:33:42 crc kubenswrapper[4831]: I0309 16:33:42.266759 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" podStartSLOduration=2.266742543 podStartE2EDuration="2.266742543s" podCreationTimestamp="2026-03-09 16:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:42.265109087 +0000 UTC m=+2149.398791510" watchObservedRunningTime="2026-03-09 16:33:42.266742543 +0000 UTC m=+2149.400424966" Mar 09 16:33:43 crc kubenswrapper[4831]: I0309 16:33:43.278034 4831 generic.go:334] "Generic (PLEG): container finished" podID="17a64c14-169a-4bee-ad27-e3f9ea984921" 
containerID="7c6ad37776926321a3ce461804570fde2c62a843b8f2efd6e2b64e17ea02e6ac" exitCode=0 Mar 09 16:33:43 crc kubenswrapper[4831]: I0309 16:33:43.278126 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" event={"ID":"17a64c14-169a-4bee-ad27-e3f9ea984921","Type":"ContainerDied","Data":"7c6ad37776926321a3ce461804570fde2c62a843b8f2efd6e2b64e17ea02e6ac"} Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.564929 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.608786 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk"] Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.623041 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk"] Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.691065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-swiftconf\") pod \"17a64c14-169a-4bee-ad27-e3f9ea984921\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.691126 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-scripts\") pod \"17a64c14-169a-4bee-ad27-e3f9ea984921\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.691202 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-ring-data-devices\") pod \"17a64c14-169a-4bee-ad27-e3f9ea984921\" (UID: 
\"17a64c14-169a-4bee-ad27-e3f9ea984921\") " Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.691233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a64c14-169a-4bee-ad27-e3f9ea984921-etc-swift\") pod \"17a64c14-169a-4bee-ad27-e3f9ea984921\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.691289 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c7fq\" (UniqueName: \"kubernetes.io/projected/17a64c14-169a-4bee-ad27-e3f9ea984921-kube-api-access-9c7fq\") pod \"17a64c14-169a-4bee-ad27-e3f9ea984921\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.691349 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-dispersionconf\") pod \"17a64c14-169a-4bee-ad27-e3f9ea984921\" (UID: \"17a64c14-169a-4bee-ad27-e3f9ea984921\") " Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.691883 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "17a64c14-169a-4bee-ad27-e3f9ea984921" (UID: "17a64c14-169a-4bee-ad27-e3f9ea984921"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.692466 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a64c14-169a-4bee-ad27-e3f9ea984921-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "17a64c14-169a-4bee-ad27-e3f9ea984921" (UID: "17a64c14-169a-4bee-ad27-e3f9ea984921"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.696769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a64c14-169a-4bee-ad27-e3f9ea984921-kube-api-access-9c7fq" (OuterVolumeSpecName: "kube-api-access-9c7fq") pod "17a64c14-169a-4bee-ad27-e3f9ea984921" (UID: "17a64c14-169a-4bee-ad27-e3f9ea984921"). InnerVolumeSpecName "kube-api-access-9c7fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.714607 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "17a64c14-169a-4bee-ad27-e3f9ea984921" (UID: "17a64c14-169a-4bee-ad27-e3f9ea984921"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.720552 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "17a64c14-169a-4bee-ad27-e3f9ea984921" (UID: "17a64c14-169a-4bee-ad27-e3f9ea984921"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.729091 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-scripts" (OuterVolumeSpecName: "scripts") pod "17a64c14-169a-4bee-ad27-e3f9ea984921" (UID: "17a64c14-169a-4bee-ad27-e3f9ea984921"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.793300 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.793339 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a64c14-169a-4bee-ad27-e3f9ea984921-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.793350 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c7fq\" (UniqueName: \"kubernetes.io/projected/17a64c14-169a-4bee-ad27-e3f9ea984921-kube-api-access-9c7fq\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.793360 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.793372 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a64c14-169a-4bee-ad27-e3f9ea984921-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:44 crc kubenswrapper[4831]: I0309 16:33:44.793381 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a64c14-169a-4bee-ad27-e3f9ea984921-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.296492 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4526add33f104ef525988577c2f0ce7abbd174528f7afa4b288f7ec4ae543191" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.296548 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtwlk" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.626678 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a64c14-169a-4bee-ad27-e3f9ea984921" path="/var/lib/kubelet/pods/17a64c14-169a-4bee-ad27-e3f9ea984921/volumes" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.738283 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l824g"] Mar 09 16:33:45 crc kubenswrapper[4831]: E0309 16:33:45.738553 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a64c14-169a-4bee-ad27-e3f9ea984921" containerName="swift-ring-rebalance" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.738564 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a64c14-169a-4bee-ad27-e3f9ea984921" containerName="swift-ring-rebalance" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.738700 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a64c14-169a-4bee-ad27-e3f9ea984921" containerName="swift-ring-rebalance" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.739145 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.742083 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.742506 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.760046 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l824g"] Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.909229 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-swiftconf\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.909342 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-scripts\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.909383 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-ring-data-devices\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.909556 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/16ecb69c-406a-4f65-80cd-53c59f63b583-etc-swift\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.909600 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-dispersionconf\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:45 crc kubenswrapper[4831]: I0309 16:33:45.909648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmfb\" (UniqueName: \"kubernetes.io/projected/16ecb69c-406a-4f65-80cd-53c59f63b583-kube-api-access-srmfb\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.011530 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/16ecb69c-406a-4f65-80cd-53c59f63b583-etc-swift\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.011622 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-dispersionconf\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc 
kubenswrapper[4831]: I0309 16:33:46.011677 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srmfb\" (UniqueName: \"kubernetes.io/projected/16ecb69c-406a-4f65-80cd-53c59f63b583-kube-api-access-srmfb\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.011755 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-swiftconf\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.011810 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-scripts\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.011852 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-ring-data-devices\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.012612 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-scripts\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc 
kubenswrapper[4831]: I0309 16:33:46.012777 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/16ecb69c-406a-4f65-80cd-53c59f63b583-etc-swift\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.013039 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-ring-data-devices\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.021465 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-dispersionconf\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.041233 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-swiftconf\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.044844 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmfb\" (UniqueName: \"kubernetes.io/projected/16ecb69c-406a-4f65-80cd-53c59f63b583-kube-api-access-srmfb\") pod \"swift-ring-rebalance-debug-l824g\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: 
I0309 16:33:46.057243 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:46 crc kubenswrapper[4831]: I0309 16:33:46.514234 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l824g"] Mar 09 16:33:46 crc kubenswrapper[4831]: W0309 16:33:46.523331 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16ecb69c_406a_4f65_80cd_53c59f63b583.slice/crio-98f234117ba9d19ad9e86c7924b710c30ba92d09e753f17ff89486d3bedfa1e7 WatchSource:0}: Error finding container 98f234117ba9d19ad9e86c7924b710c30ba92d09e753f17ff89486d3bedfa1e7: Status 404 returned error can't find the container with id 98f234117ba9d19ad9e86c7924b710c30ba92d09e753f17ff89486d3bedfa1e7 Mar 09 16:33:47 crc kubenswrapper[4831]: I0309 16:33:47.314758 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" event={"ID":"16ecb69c-406a-4f65-80cd-53c59f63b583","Type":"ContainerStarted","Data":"dc5bb4e00be4dad591defb87d2e267fae0a485190618ba0f4303104824291901"} Mar 09 16:33:47 crc kubenswrapper[4831]: I0309 16:33:47.315702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" event={"ID":"16ecb69c-406a-4f65-80cd-53c59f63b583","Type":"ContainerStarted","Data":"98f234117ba9d19ad9e86c7924b710c30ba92d09e753f17ff89486d3bedfa1e7"} Mar 09 16:33:47 crc kubenswrapper[4831]: I0309 16:33:47.347562 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" podStartSLOduration=2.347540553 podStartE2EDuration="2.347540553s" podCreationTimestamp="2026-03-09 16:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:47.331562589 +0000 
UTC m=+2154.465245012" watchObservedRunningTime="2026-03-09 16:33:47.347540553 +0000 UTC m=+2154.481222966" Mar 09 16:33:48 crc kubenswrapper[4831]: I0309 16:33:48.325111 4831 generic.go:334] "Generic (PLEG): container finished" podID="16ecb69c-406a-4f65-80cd-53c59f63b583" containerID="dc5bb4e00be4dad591defb87d2e267fae0a485190618ba0f4303104824291901" exitCode=0 Mar 09 16:33:48 crc kubenswrapper[4831]: I0309 16:33:48.325882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" event={"ID":"16ecb69c-406a-4f65-80cd-53c59f63b583","Type":"ContainerDied","Data":"dc5bb4e00be4dad591defb87d2e267fae0a485190618ba0f4303104824291901"} Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.607675 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.643228 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l824g"] Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.659319 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l824g"] Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.767343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-swiftconf\") pod \"16ecb69c-406a-4f65-80cd-53c59f63b583\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.767472 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srmfb\" (UniqueName: \"kubernetes.io/projected/16ecb69c-406a-4f65-80cd-53c59f63b583-kube-api-access-srmfb\") pod \"16ecb69c-406a-4f65-80cd-53c59f63b583\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 
16:33:49.768295 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-ring-data-devices\") pod \"16ecb69c-406a-4f65-80cd-53c59f63b583\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.768388 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-scripts\") pod \"16ecb69c-406a-4f65-80cd-53c59f63b583\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.768385 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "16ecb69c-406a-4f65-80cd-53c59f63b583" (UID: "16ecb69c-406a-4f65-80cd-53c59f63b583"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.768463 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/16ecb69c-406a-4f65-80cd-53c59f63b583-etc-swift\") pod \"16ecb69c-406a-4f65-80cd-53c59f63b583\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.768497 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-dispersionconf\") pod \"16ecb69c-406a-4f65-80cd-53c59f63b583\" (UID: \"16ecb69c-406a-4f65-80cd-53c59f63b583\") " Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.768954 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.769279 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ecb69c-406a-4f65-80cd-53c59f63b583-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "16ecb69c-406a-4f65-80cd-53c59f63b583" (UID: "16ecb69c-406a-4f65-80cd-53c59f63b583"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.773353 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ecb69c-406a-4f65-80cd-53c59f63b583-kube-api-access-srmfb" (OuterVolumeSpecName: "kube-api-access-srmfb") pod "16ecb69c-406a-4f65-80cd-53c59f63b583" (UID: "16ecb69c-406a-4f65-80cd-53c59f63b583"). InnerVolumeSpecName "kube-api-access-srmfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.788902 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-scripts" (OuterVolumeSpecName: "scripts") pod "16ecb69c-406a-4f65-80cd-53c59f63b583" (UID: "16ecb69c-406a-4f65-80cd-53c59f63b583"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.789251 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "16ecb69c-406a-4f65-80cd-53c59f63b583" (UID: "16ecb69c-406a-4f65-80cd-53c59f63b583"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.790812 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "16ecb69c-406a-4f65-80cd-53c59f63b583" (UID: "16ecb69c-406a-4f65-80cd-53c59f63b583"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.870854 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ecb69c-406a-4f65-80cd-53c59f63b583-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.870886 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/16ecb69c-406a-4f65-80cd-53c59f63b583-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.870897 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.870909 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/16ecb69c-406a-4f65-80cd-53c59f63b583-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:49.870920 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srmfb\" (UniqueName: \"kubernetes.io/projected/16ecb69c-406a-4f65-80cd-53c59f63b583-kube-api-access-srmfb\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.343578 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98f234117ba9d19ad9e86c7924b710c30ba92d09e753f17ff89486d3bedfa1e7" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.343753 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l824g" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.852086 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztt26"] Mar 09 16:33:50 crc kubenswrapper[4831]: E0309 16:33:50.852434 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ecb69c-406a-4f65-80cd-53c59f63b583" containerName="swift-ring-rebalance" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.852451 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ecb69c-406a-4f65-80cd-53c59f63b583" containerName="swift-ring-rebalance" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.852662 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ecb69c-406a-4f65-80cd-53c59f63b583" containerName="swift-ring-rebalance" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.853228 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.855539 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.855835 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.864834 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztt26"] Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.986290 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-dispersionconf\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.986454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7cf98751-a46a-43ee-9321-cfc79b49ed62-etc-swift\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.986522 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-swiftconf\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.986588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-ring-data-devices\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.986690 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7rss\" (UniqueName: \"kubernetes.io/projected/7cf98751-a46a-43ee-9321-cfc79b49ed62-kube-api-access-q7rss\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:50 crc kubenswrapper[4831]: I0309 16:33:50.986734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-scripts\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.087535 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7cf98751-a46a-43ee-9321-cfc79b49ed62-etc-swift\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.087593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-swiftconf\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.087634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-ring-data-devices\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.087707 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7rss\" (UniqueName: \"kubernetes.io/projected/7cf98751-a46a-43ee-9321-cfc79b49ed62-kube-api-access-q7rss\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.087735 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-scripts\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.087761 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-dispersionconf\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.088694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-ring-data-devices\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.088702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-scripts\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.088764 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7cf98751-a46a-43ee-9321-cfc79b49ed62-etc-swift\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.091895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-dispersionconf\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.091973 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-swiftconf\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.104440 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7rss\" (UniqueName: \"kubernetes.io/projected/7cf98751-a46a-43ee-9321-cfc79b49ed62-kube-api-access-q7rss\") pod \"swift-ring-rebalance-debug-ztt26\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.201592 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.629575 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ecb69c-406a-4f65-80cd-53c59f63b583" path="/var/lib/kubelet/pods/16ecb69c-406a-4f65-80cd-53c59f63b583/volumes" Mar 09 16:33:51 crc kubenswrapper[4831]: I0309 16:33:51.633659 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztt26"] Mar 09 16:33:52 crc kubenswrapper[4831]: I0309 16:33:52.361989 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" event={"ID":"7cf98751-a46a-43ee-9321-cfc79b49ed62","Type":"ContainerStarted","Data":"f041a1e2578461b56c74618911f354010bb1f6a65b2cf1cf4b08d2b0bb974567"} Mar 09 16:33:52 crc kubenswrapper[4831]: I0309 16:33:52.363278 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" event={"ID":"7cf98751-a46a-43ee-9321-cfc79b49ed62","Type":"ContainerStarted","Data":"9e0a3da7709c4161f816a97baf0f3bfb3d21b23cf411774d922edcdf7a901c40"} Mar 09 16:33:52 crc kubenswrapper[4831]: I0309 16:33:52.376662 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" podStartSLOduration=2.376642933 podStartE2EDuration="2.376642933s" podCreationTimestamp="2026-03-09 16:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:52.375779668 +0000 UTC m=+2159.509462101" watchObservedRunningTime="2026-03-09 16:33:52.376642933 +0000 UTC m=+2159.510325356" Mar 09 16:33:53 crc kubenswrapper[4831]: I0309 16:33:53.370722 4831 generic.go:334] "Generic (PLEG): container finished" podID="7cf98751-a46a-43ee-9321-cfc79b49ed62" containerID="f041a1e2578461b56c74618911f354010bb1f6a65b2cf1cf4b08d2b0bb974567" exitCode=0 
Mar 09 16:33:53 crc kubenswrapper[4831]: I0309 16:33:53.370782 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" event={"ID":"7cf98751-a46a-43ee-9321-cfc79b49ed62","Type":"ContainerDied","Data":"f041a1e2578461b56c74618911f354010bb1f6a65b2cf1cf4b08d2b0bb974567"} Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.694153 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.729886 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztt26"] Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.736134 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztt26"] Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.840870 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-swiftconf\") pod \"7cf98751-a46a-43ee-9321-cfc79b49ed62\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.840988 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-dispersionconf\") pod \"7cf98751-a46a-43ee-9321-cfc79b49ed62\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.841037 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-ring-data-devices\") pod \"7cf98751-a46a-43ee-9321-cfc79b49ed62\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.841101 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7rss\" (UniqueName: \"kubernetes.io/projected/7cf98751-a46a-43ee-9321-cfc79b49ed62-kube-api-access-q7rss\") pod \"7cf98751-a46a-43ee-9321-cfc79b49ed62\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.841169 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7cf98751-a46a-43ee-9321-cfc79b49ed62-etc-swift\") pod \"7cf98751-a46a-43ee-9321-cfc79b49ed62\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.841212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-scripts\") pod \"7cf98751-a46a-43ee-9321-cfc79b49ed62\" (UID: \"7cf98751-a46a-43ee-9321-cfc79b49ed62\") " Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.841781 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7cf98751-a46a-43ee-9321-cfc79b49ed62" (UID: "7cf98751-a46a-43ee-9321-cfc79b49ed62"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.842087 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf98751-a46a-43ee-9321-cfc79b49ed62-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7cf98751-a46a-43ee-9321-cfc79b49ed62" (UID: "7cf98751-a46a-43ee-9321-cfc79b49ed62"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.847004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf98751-a46a-43ee-9321-cfc79b49ed62-kube-api-access-q7rss" (OuterVolumeSpecName: "kube-api-access-q7rss") pod "7cf98751-a46a-43ee-9321-cfc79b49ed62" (UID: "7cf98751-a46a-43ee-9321-cfc79b49ed62"). InnerVolumeSpecName "kube-api-access-q7rss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.861793 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7cf98751-a46a-43ee-9321-cfc79b49ed62" (UID: "7cf98751-a46a-43ee-9321-cfc79b49ed62"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.868620 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7cf98751-a46a-43ee-9321-cfc79b49ed62" (UID: "7cf98751-a46a-43ee-9321-cfc79b49ed62"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.877703 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-scripts" (OuterVolumeSpecName: "scripts") pod "7cf98751-a46a-43ee-9321-cfc79b49ed62" (UID: "7cf98751-a46a-43ee-9321-cfc79b49ed62"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.942626 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.942657 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.942668 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7rss\" (UniqueName: \"kubernetes.io/projected/7cf98751-a46a-43ee-9321-cfc79b49ed62-kube-api-access-q7rss\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.942680 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7cf98751-a46a-43ee-9321-cfc79b49ed62-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.942688 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cf98751-a46a-43ee-9321-cfc79b49ed62-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:54 crc kubenswrapper[4831]: I0309 16:33:54.942697 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7cf98751-a46a-43ee-9321-cfc79b49ed62-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.388484 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0a3da7709c4161f816a97baf0f3bfb3d21b23cf411774d922edcdf7a901c40" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.388538 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztt26" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.627998 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf98751-a46a-43ee-9321-cfc79b49ed62" path="/var/lib/kubelet/pods/7cf98751-a46a-43ee-9321-cfc79b49ed62/volumes" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.876927 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx"] Mar 09 16:33:55 crc kubenswrapper[4831]: E0309 16:33:55.877320 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf98751-a46a-43ee-9321-cfc79b49ed62" containerName="swift-ring-rebalance" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.877334 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf98751-a46a-43ee-9321-cfc79b49ed62" containerName="swift-ring-rebalance" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.877548 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf98751-a46a-43ee-9321-cfc79b49ed62" containerName="swift-ring-rebalance" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.878085 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.880661 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.880694 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.888333 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx"] Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.957172 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-dispersionconf\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.957230 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-scripts\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.957263 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-swiftconf\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.957421 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-ring-data-devices\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.957486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd6rl\" (UniqueName: \"kubernetes.io/projected/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-kube-api-access-xd6rl\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:55 crc kubenswrapper[4831]: I0309 16:33:55.957622 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-etc-swift\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.059519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-dispersionconf\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.059595 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-scripts\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.059636 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-swiftconf\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.059704 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-ring-data-devices\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.059739 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6rl\" (UniqueName: \"kubernetes.io/projected/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-kube-api-access-xd6rl\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.059779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-etc-swift\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.060566 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-etc-swift\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.060877 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-scripts\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.061229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-ring-data-devices\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.065221 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-swiftconf\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.068967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-dispersionconf\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.081385 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6rl\" (UniqueName: \"kubernetes.io/projected/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-kube-api-access-xd6rl\") pod \"swift-ring-rebalance-debug-hbsdx\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.196603 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:56 crc kubenswrapper[4831]: I0309 16:33:56.656999 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx"] Mar 09 16:33:56 crc kubenswrapper[4831]: W0309 16:33:56.660981 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d0f3cf_55e3_44bd_8099_8fb90d8d0f88.slice/crio-ca033f62fc9460e4b3fcb85756a1c74872a0b83004f5f00b3a9b8bed69c74017 WatchSource:0}: Error finding container ca033f62fc9460e4b3fcb85756a1c74872a0b83004f5f00b3a9b8bed69c74017: Status 404 returned error can't find the container with id ca033f62fc9460e4b3fcb85756a1c74872a0b83004f5f00b3a9b8bed69c74017 Mar 09 16:33:57 crc kubenswrapper[4831]: I0309 16:33:57.407785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" event={"ID":"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88","Type":"ContainerStarted","Data":"85bd876b2996bedfbc2b8b48bcd7c3b9ccdbdf427146592695257ad99f2a5a11"} Mar 09 16:33:57 crc kubenswrapper[4831]: I0309 16:33:57.408131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" event={"ID":"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88","Type":"ContainerStarted","Data":"ca033f62fc9460e4b3fcb85756a1c74872a0b83004f5f00b3a9b8bed69c74017"} Mar 09 16:33:57 crc kubenswrapper[4831]: I0309 16:33:57.423219 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" podStartSLOduration=2.42320006 podStartE2EDuration="2.42320006s" podCreationTimestamp="2026-03-09 16:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:33:57.420978077 +0000 UTC m=+2164.554660500" 
watchObservedRunningTime="2026-03-09 16:33:57.42320006 +0000 UTC m=+2164.556882483" Mar 09 16:33:58 crc kubenswrapper[4831]: I0309 16:33:58.417675 4831 generic.go:334] "Generic (PLEG): container finished" podID="29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" containerID="85bd876b2996bedfbc2b8b48bcd7c3b9ccdbdf427146592695257ad99f2a5a11" exitCode=0 Mar 09 16:33:58 crc kubenswrapper[4831]: I0309 16:33:58.417746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" event={"ID":"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88","Type":"ContainerDied","Data":"85bd876b2996bedfbc2b8b48bcd7c3b9ccdbdf427146592695257ad99f2a5a11"} Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.699033 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.729086 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx"] Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.733477 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx"] Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.815729 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-scripts\") pod \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.815787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-etc-swift\") pod \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.815860 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xd6rl\" (UniqueName: \"kubernetes.io/projected/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-kube-api-access-xd6rl\") pod \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.815892 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-ring-data-devices\") pod \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.815917 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-dispersionconf\") pod \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.815938 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-swiftconf\") pod \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\" (UID: \"29d0f3cf-55e3-44bd-8099-8fb90d8d0f88\") " Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.816478 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" (UID: "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.816947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" (UID: "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.821357 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-kube-api-access-xd6rl" (OuterVolumeSpecName: "kube-api-access-xd6rl") pod "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" (UID: "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88"). InnerVolumeSpecName "kube-api-access-xd6rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.835156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-scripts" (OuterVolumeSpecName: "scripts") pod "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" (UID: "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.838356 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" (UID: "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.841192 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" (UID: "29d0f3cf-55e3-44bd-8099-8fb90d8d0f88"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.917619 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd6rl\" (UniqueName: \"kubernetes.io/projected/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-kube-api-access-xd6rl\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.917794 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.917876 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.917928 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.918031 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:33:59 crc kubenswrapper[4831]: I0309 16:33:59.918090 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.144507 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551234-nm6sk"] Mar 09 16:34:00 crc kubenswrapper[4831]: E0309 16:34:00.145027 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" containerName="swift-ring-rebalance" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.145203 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" containerName="swift-ring-rebalance" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.145393 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" containerName="swift-ring-rebalance" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.145930 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551234-nm6sk" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.148690 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.148773 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.148707 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.157142 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551234-nm6sk"] Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.222097 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tgwm\" (UniqueName: \"kubernetes.io/projected/5c4074f3-024f-4d63-a9d7-9fe5893c42d8-kube-api-access-7tgwm\") pod \"auto-csr-approver-29551234-nm6sk\" (UID: \"5c4074f3-024f-4d63-a9d7-9fe5893c42d8\") " pod="openshift-infra/auto-csr-approver-29551234-nm6sk" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.323684 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tgwm\" (UniqueName: \"kubernetes.io/projected/5c4074f3-024f-4d63-a9d7-9fe5893c42d8-kube-api-access-7tgwm\") pod \"auto-csr-approver-29551234-nm6sk\" (UID: \"5c4074f3-024f-4d63-a9d7-9fe5893c42d8\") " pod="openshift-infra/auto-csr-approver-29551234-nm6sk" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.352076 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tgwm\" (UniqueName: \"kubernetes.io/projected/5c4074f3-024f-4d63-a9d7-9fe5893c42d8-kube-api-access-7tgwm\") pod \"auto-csr-approver-29551234-nm6sk\" (UID: \"5c4074f3-024f-4d63-a9d7-9fe5893c42d8\") " 
pod="openshift-infra/auto-csr-approver-29551234-nm6sk" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.436825 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca033f62fc9460e4b3fcb85756a1c74872a0b83004f5f00b3a9b8bed69c74017" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.436916 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hbsdx" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.467825 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551234-nm6sk" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.914550 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t56kj"] Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.915854 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.919512 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.919522 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.935670 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t56kj"] Mar 09 16:34:00 crc kubenswrapper[4831]: I0309 16:34:00.973843 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551234-nm6sk"] Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.034060 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/ad61873a-1527-41de-a33e-edb0e3db2fed-etc-swift\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.034137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-dispersionconf\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.034167 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-swiftconf\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.034195 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-scripts\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.034246 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx9dk\" (UniqueName: \"kubernetes.io/projected/ad61873a-1527-41de-a33e-edb0e3db2fed-kube-api-access-tx9dk\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.034393 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-ring-data-devices\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.136129 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad61873a-1527-41de-a33e-edb0e3db2fed-etc-swift\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.136210 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-dispersionconf\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.136233 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-swiftconf\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.136260 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-scripts\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.136299 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9dk\" (UniqueName: \"kubernetes.io/projected/ad61873a-1527-41de-a33e-edb0e3db2fed-kube-api-access-tx9dk\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.136344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-ring-data-devices\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.137378 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-scripts\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.137545 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-ring-data-devices\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.137689 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad61873a-1527-41de-a33e-edb0e3db2fed-etc-swift\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.141893 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-swiftconf\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.142834 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-dispersionconf\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.162167 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx9dk\" (UniqueName: \"kubernetes.io/projected/ad61873a-1527-41de-a33e-edb0e3db2fed-kube-api-access-tx9dk\") pod \"swift-ring-rebalance-debug-t56kj\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.242645 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.447893 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551234-nm6sk" event={"ID":"5c4074f3-024f-4d63-a9d7-9fe5893c42d8","Type":"ContainerStarted","Data":"cc6844015184473a5a8fa42b0268b10dfd796922dbb87f35f0ef5cb790e620c4"} Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.626546 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d0f3cf-55e3-44bd-8099-8fb90d8d0f88" path="/var/lib/kubelet/pods/29d0f3cf-55e3-44bd-8099-8fb90d8d0f88/volumes" Mar 09 16:34:01 crc kubenswrapper[4831]: I0309 16:34:01.663980 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t56kj"] Mar 09 16:34:01 crc kubenswrapper[4831]: W0309 16:34:01.667678 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad61873a_1527_41de_a33e_edb0e3db2fed.slice/crio-01e313bbce78b97d3e6e59b68cc496e8f1a2de29233a0944a2b53badbf408a72 WatchSource:0}: Error finding container 01e313bbce78b97d3e6e59b68cc496e8f1a2de29233a0944a2b53badbf408a72: Status 404 returned error can't find the container with id 01e313bbce78b97d3e6e59b68cc496e8f1a2de29233a0944a2b53badbf408a72 Mar 09 16:34:02 crc kubenswrapper[4831]: I0309 16:34:02.458362 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" event={"ID":"ad61873a-1527-41de-a33e-edb0e3db2fed","Type":"ContainerStarted","Data":"6961fb2f08bddb437a7f685cd7a99515f7d506f8eb6b52b66c2c737f3183c7dd"} Mar 09 16:34:02 crc kubenswrapper[4831]: I0309 16:34:02.459507 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" 
event={"ID":"ad61873a-1527-41de-a33e-edb0e3db2fed","Type":"ContainerStarted","Data":"01e313bbce78b97d3e6e59b68cc496e8f1a2de29233a0944a2b53badbf408a72"} Mar 09 16:34:02 crc kubenswrapper[4831]: I0309 16:34:02.482858 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" podStartSLOduration=2.482836369 podStartE2EDuration="2.482836369s" podCreationTimestamp="2026-03-09 16:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:02.472432973 +0000 UTC m=+2169.606115406" watchObservedRunningTime="2026-03-09 16:34:02.482836369 +0000 UTC m=+2169.616518792" Mar 09 16:34:03 crc kubenswrapper[4831]: I0309 16:34:03.468996 4831 generic.go:334] "Generic (PLEG): container finished" podID="5c4074f3-024f-4d63-a9d7-9fe5893c42d8" containerID="7c42301a71b85bb9ea00a327024cf90b1bbb280b876a6b2c466544d622608294" exitCode=0 Mar 09 16:34:03 crc kubenswrapper[4831]: I0309 16:34:03.469049 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551234-nm6sk" event={"ID":"5c4074f3-024f-4d63-a9d7-9fe5893c42d8","Type":"ContainerDied","Data":"7c42301a71b85bb9ea00a327024cf90b1bbb280b876a6b2c466544d622608294"} Mar 09 16:34:04 crc kubenswrapper[4831]: I0309 16:34:04.477467 4831 generic.go:334] "Generic (PLEG): container finished" podID="ad61873a-1527-41de-a33e-edb0e3db2fed" containerID="6961fb2f08bddb437a7f685cd7a99515f7d506f8eb6b52b66c2c737f3183c7dd" exitCode=0 Mar 09 16:34:04 crc kubenswrapper[4831]: I0309 16:34:04.477559 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" event={"ID":"ad61873a-1527-41de-a33e-edb0e3db2fed","Type":"ContainerDied","Data":"6961fb2f08bddb437a7f685cd7a99515f7d506f8eb6b52b66c2c737f3183c7dd"} Mar 09 16:34:04 crc kubenswrapper[4831]: I0309 16:34:04.764233 4831 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551234-nm6sk" Mar 09 16:34:04 crc kubenswrapper[4831]: I0309 16:34:04.799862 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tgwm\" (UniqueName: \"kubernetes.io/projected/5c4074f3-024f-4d63-a9d7-9fe5893c42d8-kube-api-access-7tgwm\") pod \"5c4074f3-024f-4d63-a9d7-9fe5893c42d8\" (UID: \"5c4074f3-024f-4d63-a9d7-9fe5893c42d8\") " Mar 09 16:34:04 crc kubenswrapper[4831]: I0309 16:34:04.817639 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4074f3-024f-4d63-a9d7-9fe5893c42d8-kube-api-access-7tgwm" (OuterVolumeSpecName: "kube-api-access-7tgwm") pod "5c4074f3-024f-4d63-a9d7-9fe5893c42d8" (UID: "5c4074f3-024f-4d63-a9d7-9fe5893c42d8"). InnerVolumeSpecName "kube-api-access-7tgwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:04 crc kubenswrapper[4831]: I0309 16:34:04.901843 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tgwm\" (UniqueName: \"kubernetes.io/projected/5c4074f3-024f-4d63-a9d7-9fe5893c42d8-kube-api-access-7tgwm\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.489457 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551234-nm6sk" event={"ID":"5c4074f3-024f-4d63-a9d7-9fe5893c42d8","Type":"ContainerDied","Data":"cc6844015184473a5a8fa42b0268b10dfd796922dbb87f35f0ef5cb790e620c4"} Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.489507 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc6844015184473a5a8fa42b0268b10dfd796922dbb87f35f0ef5cb790e620c4" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.489532 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551234-nm6sk" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.770703 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.819275 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-scripts\") pod \"ad61873a-1527-41de-a33e-edb0e3db2fed\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.819432 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-ring-data-devices\") pod \"ad61873a-1527-41de-a33e-edb0e3db2fed\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.819483 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad61873a-1527-41de-a33e-edb0e3db2fed-etc-swift\") pod \"ad61873a-1527-41de-a33e-edb0e3db2fed\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.819904 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-swiftconf\") pod \"ad61873a-1527-41de-a33e-edb0e3db2fed\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.819996 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx9dk\" (UniqueName: \"kubernetes.io/projected/ad61873a-1527-41de-a33e-edb0e3db2fed-kube-api-access-tx9dk\") pod \"ad61873a-1527-41de-a33e-edb0e3db2fed\" (UID: 
\"ad61873a-1527-41de-a33e-edb0e3db2fed\") " Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.820148 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-dispersionconf\") pod \"ad61873a-1527-41de-a33e-edb0e3db2fed\" (UID: \"ad61873a-1527-41de-a33e-edb0e3db2fed\") " Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.823744 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ad61873a-1527-41de-a33e-edb0e3db2fed" (UID: "ad61873a-1527-41de-a33e-edb0e3db2fed"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.827585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad61873a-1527-41de-a33e-edb0e3db2fed-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ad61873a-1527-41de-a33e-edb0e3db2fed" (UID: "ad61873a-1527-41de-a33e-edb0e3db2fed"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.832276 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad61873a-1527-41de-a33e-edb0e3db2fed-kube-api-access-tx9dk" (OuterVolumeSpecName: "kube-api-access-tx9dk") pod "ad61873a-1527-41de-a33e-edb0e3db2fed" (UID: "ad61873a-1527-41de-a33e-edb0e3db2fed"). InnerVolumeSpecName "kube-api-access-tx9dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.848805 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-scripts" (OuterVolumeSpecName: "scripts") pod "ad61873a-1527-41de-a33e-edb0e3db2fed" (UID: "ad61873a-1527-41de-a33e-edb0e3db2fed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.852784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ad61873a-1527-41de-a33e-edb0e3db2fed" (UID: "ad61873a-1527-41de-a33e-edb0e3db2fed"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.862134 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t56kj"] Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.863832 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ad61873a-1527-41de-a33e-edb0e3db2fed" (UID: "ad61873a-1527-41de-a33e-edb0e3db2fed"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.868841 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t56kj"] Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.874423 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551228-gs4b5"] Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.879384 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551228-gs4b5"] Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.922794 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.922839 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx9dk\" (UniqueName: \"kubernetes.io/projected/ad61873a-1527-41de-a33e-edb0e3db2fed-kube-api-access-tx9dk\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.922852 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad61873a-1527-41de-a33e-edb0e3db2fed-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.922863 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.922872 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad61873a-1527-41de-a33e-edb0e3db2fed-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:05 crc kubenswrapper[4831]: I0309 16:34:05.922881 4831 
reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad61873a-1527-41de-a33e-edb0e3db2fed-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.497706 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01e313bbce78b97d3e6e59b68cc496e8f1a2de29233a0944a2b53badbf408a72" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.497779 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t56kj" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.960750 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw"] Mar 09 16:34:06 crc kubenswrapper[4831]: E0309 16:34:06.961074 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad61873a-1527-41de-a33e-edb0e3db2fed" containerName="swift-ring-rebalance" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.961095 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad61873a-1527-41de-a33e-edb0e3db2fed" containerName="swift-ring-rebalance" Mar 09 16:34:06 crc kubenswrapper[4831]: E0309 16:34:06.961125 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4074f3-024f-4d63-a9d7-9fe5893c42d8" containerName="oc" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.961136 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4074f3-024f-4d63-a9d7-9fe5893c42d8" containerName="oc" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.961295 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad61873a-1527-41de-a33e-edb0e3db2fed" containerName="swift-ring-rebalance" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.961316 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4074f3-024f-4d63-a9d7-9fe5893c42d8" containerName="oc" Mar 09 16:34:06 crc kubenswrapper[4831]: 
I0309 16:34:06.961893 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.963947 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.964248 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:06 crc kubenswrapper[4831]: I0309 16:34:06.973587 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw"] Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.038091 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-scripts\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.038176 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qshfm\" (UniqueName: \"kubernetes.io/projected/7701c468-ddc5-4a35-88a1-3693545ddc54-kube-api-access-qshfm\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.038217 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-ring-data-devices\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc 
kubenswrapper[4831]: I0309 16:34:07.038234 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7701c468-ddc5-4a35-88a1-3693545ddc54-etc-swift\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.038255 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-swiftconf\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.038429 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-dispersionconf\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.140170 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-ring-data-devices\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.140231 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7701c468-ddc5-4a35-88a1-3693545ddc54-etc-swift\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.140263 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-swiftconf\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.140303 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-dispersionconf\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.140353 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-scripts\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.140426 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qshfm\" (UniqueName: \"kubernetes.io/projected/7701c468-ddc5-4a35-88a1-3693545ddc54-kube-api-access-qshfm\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.141018 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7701c468-ddc5-4a35-88a1-3693545ddc54-etc-swift\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.141184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-scripts\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.141262 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-ring-data-devices\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.145636 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-dispersionconf\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.153826 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-swiftconf\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.156904 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qshfm\" (UniqueName: \"kubernetes.io/projected/7701c468-ddc5-4a35-88a1-3693545ddc54-kube-api-access-qshfm\") pod \"swift-ring-rebalance-debug-rk5bw\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.275711 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.482917 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw"] Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.507211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" event={"ID":"7701c468-ddc5-4a35-88a1-3693545ddc54","Type":"ContainerStarted","Data":"2dbd97c140d625eccfc43238dbe1e2744244bce2f218df4bc3abb1d45b0037e8"} Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.638057 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ca7be6-7949-429b-8aae-074a0b28c000" path="/var/lib/kubelet/pods/20ca7be6-7949-429b-8aae-074a0b28c000/volumes" Mar 09 16:34:07 crc kubenswrapper[4831]: I0309 16:34:07.638944 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad61873a-1527-41de-a33e-edb0e3db2fed" path="/var/lib/kubelet/pods/ad61873a-1527-41de-a33e-edb0e3db2fed/volumes" Mar 09 16:34:08 crc kubenswrapper[4831]: I0309 16:34:08.516456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" event={"ID":"7701c468-ddc5-4a35-88a1-3693545ddc54","Type":"ContainerStarted","Data":"0ada448bbd3493a6a3ac42311856d0208188814374ba658943fd8763db316bcb"} Mar 09 16:34:08 crc kubenswrapper[4831]: I0309 16:34:08.546390 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" podStartSLOduration=2.546372738 podStartE2EDuration="2.546372738s" podCreationTimestamp="2026-03-09 16:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 16:34:08.539692629 +0000 UTC m=+2175.673375062" watchObservedRunningTime="2026-03-09 16:34:08.546372738 +0000 UTC m=+2175.680055161" Mar 09 16:34:09 crc kubenswrapper[4831]: I0309 16:34:09.555589 4831 generic.go:334] "Generic (PLEG): container finished" podID="7701c468-ddc5-4a35-88a1-3693545ddc54" containerID="0ada448bbd3493a6a3ac42311856d0208188814374ba658943fd8763db316bcb" exitCode=0 Mar 09 16:34:09 crc kubenswrapper[4831]: I0309 16:34:09.555915 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" event={"ID":"7701c468-ddc5-4a35-88a1-3693545ddc54","Type":"ContainerDied","Data":"0ada448bbd3493a6a3ac42311856d0208188814374ba658943fd8763db316bcb"} Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.860275 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.893249 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-swiftconf\") pod \"7701c468-ddc5-4a35-88a1-3693545ddc54\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.893304 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7701c468-ddc5-4a35-88a1-3693545ddc54-etc-swift\") pod \"7701c468-ddc5-4a35-88a1-3693545ddc54\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.893408 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qshfm\" (UniqueName: \"kubernetes.io/projected/7701c468-ddc5-4a35-88a1-3693545ddc54-kube-api-access-qshfm\") pod \"7701c468-ddc5-4a35-88a1-3693545ddc54\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " 
Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.893496 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-ring-data-devices\") pod \"7701c468-ddc5-4a35-88a1-3693545ddc54\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.893616 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-scripts\") pod \"7701c468-ddc5-4a35-88a1-3693545ddc54\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.893649 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-dispersionconf\") pod \"7701c468-ddc5-4a35-88a1-3693545ddc54\" (UID: \"7701c468-ddc5-4a35-88a1-3693545ddc54\") " Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.894332 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7701c468-ddc5-4a35-88a1-3693545ddc54-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7701c468-ddc5-4a35-88a1-3693545ddc54" (UID: "7701c468-ddc5-4a35-88a1-3693545ddc54"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.895378 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7701c468-ddc5-4a35-88a1-3693545ddc54" (UID: "7701c468-ddc5-4a35-88a1-3693545ddc54"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.902426 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw"] Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.909125 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw"] Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.913859 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7701c468-ddc5-4a35-88a1-3693545ddc54-kube-api-access-qshfm" (OuterVolumeSpecName: "kube-api-access-qshfm") pod "7701c468-ddc5-4a35-88a1-3693545ddc54" (UID: "7701c468-ddc5-4a35-88a1-3693545ddc54"). InnerVolumeSpecName "kube-api-access-qshfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.918018 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-scripts" (OuterVolumeSpecName: "scripts") pod "7701c468-ddc5-4a35-88a1-3693545ddc54" (UID: "7701c468-ddc5-4a35-88a1-3693545ddc54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.921485 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7701c468-ddc5-4a35-88a1-3693545ddc54" (UID: "7701c468-ddc5-4a35-88a1-3693545ddc54"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.930685 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7701c468-ddc5-4a35-88a1-3693545ddc54" (UID: "7701c468-ddc5-4a35-88a1-3693545ddc54"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.996149 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qshfm\" (UniqueName: \"kubernetes.io/projected/7701c468-ddc5-4a35-88a1-3693545ddc54-kube-api-access-qshfm\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.996197 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.996209 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7701c468-ddc5-4a35-88a1-3693545ddc54-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.996219 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.996230 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7701c468-ddc5-4a35-88a1-3693545ddc54-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:10 crc kubenswrapper[4831]: I0309 16:34:10.996237 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/7701c468-ddc5-4a35-88a1-3693545ddc54-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:11 crc kubenswrapper[4831]: I0309 16:34:11.572336 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbd97c140d625eccfc43238dbe1e2744244bce2f218df4bc3abb1d45b0037e8" Mar 09 16:34:11 crc kubenswrapper[4831]: I0309 16:34:11.572534 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rk5bw" Mar 09 16:34:11 crc kubenswrapper[4831]: I0309 16:34:11.633112 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7701c468-ddc5-4a35-88a1-3693545ddc54" path="/var/lib/kubelet/pods/7701c468-ddc5-4a35-88a1-3693545ddc54/volumes" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.029199 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-scckp"] Mar 09 16:34:12 crc kubenswrapper[4831]: E0309 16:34:12.029845 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7701c468-ddc5-4a35-88a1-3693545ddc54" containerName="swift-ring-rebalance" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.029861 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7701c468-ddc5-4a35-88a1-3693545ddc54" containerName="swift-ring-rebalance" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.030065 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7701c468-ddc5-4a35-88a1-3693545ddc54" containerName="swift-ring-rebalance" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.030693 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.037165 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.047492 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.049290 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-scckp"] Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.113331 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-dispersionconf\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.113384 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6882\" (UniqueName: \"kubernetes.io/projected/86aa3233-17b3-47f4-9c82-e652a47add98-kube-api-access-k6882\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.113441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-scripts\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.113479 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-ring-data-devices\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.113507 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-swiftconf\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.113543 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/86aa3233-17b3-47f4-9c82-e652a47add98-etc-swift\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.214709 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-ring-data-devices\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.214762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-swiftconf\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc 
kubenswrapper[4831]: I0309 16:34:12.214831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/86aa3233-17b3-47f4-9c82-e652a47add98-etc-swift\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.214885 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-dispersionconf\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.214933 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6882\" (UniqueName: \"kubernetes.io/projected/86aa3233-17b3-47f4-9c82-e652a47add98-kube-api-access-k6882\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.214968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-scripts\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.215755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/86aa3233-17b3-47f4-9c82-e652a47add98-etc-swift\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc 
kubenswrapper[4831]: I0309 16:34:12.215858 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-scripts\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.216311 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-ring-data-devices\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.219494 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-dispersionconf\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.220761 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-swiftconf\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.234972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6882\" (UniqueName: \"kubernetes.io/projected/86aa3233-17b3-47f4-9c82-e652a47add98-kube-api-access-k6882\") pod \"swift-ring-rebalance-debug-scckp\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: 
I0309 16:34:12.353878 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:12 crc kubenswrapper[4831]: I0309 16:34:12.785307 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-scckp"] Mar 09 16:34:13 crc kubenswrapper[4831]: I0309 16:34:13.595534 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" event={"ID":"86aa3233-17b3-47f4-9c82-e652a47add98","Type":"ContainerStarted","Data":"97f154757044ced4e0a4c573755be3ef0e1f87b3961db4bfa6c1a42cf5a7d6dd"} Mar 09 16:34:13 crc kubenswrapper[4831]: I0309 16:34:13.596103 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" event={"ID":"86aa3233-17b3-47f4-9c82-e652a47add98","Type":"ContainerStarted","Data":"dd3f40473f2e8ee45b3f8ac7e4e83b0a2536b362e13a245284bf90bc76a6b7cb"} Mar 09 16:34:13 crc kubenswrapper[4831]: I0309 16:34:13.623737 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" podStartSLOduration=1.623712359 podStartE2EDuration="1.623712359s" podCreationTimestamp="2026-03-09 16:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:13.614054414 +0000 UTC m=+2180.747736917" watchObservedRunningTime="2026-03-09 16:34:13.623712359 +0000 UTC m=+2180.757394812" Mar 09 16:34:14 crc kubenswrapper[4831]: I0309 16:34:14.610525 4831 generic.go:334] "Generic (PLEG): container finished" podID="86aa3233-17b3-47f4-9c82-e652a47add98" containerID="97f154757044ced4e0a4c573755be3ef0e1f87b3961db4bfa6c1a42cf5a7d6dd" exitCode=0 Mar 09 16:34:14 crc kubenswrapper[4831]: I0309 16:34:14.610583 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" 
event={"ID":"86aa3233-17b3-47f4-9c82-e652a47add98","Type":"ContainerDied","Data":"97f154757044ced4e0a4c573755be3ef0e1f87b3961db4bfa6c1a42cf5a7d6dd"} Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.875599 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.918720 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-scckp"] Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.929990 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-scckp"] Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.979136 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-ring-data-devices\") pod \"86aa3233-17b3-47f4-9c82-e652a47add98\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.979203 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6882\" (UniqueName: \"kubernetes.io/projected/86aa3233-17b3-47f4-9c82-e652a47add98-kube-api-access-k6882\") pod \"86aa3233-17b3-47f4-9c82-e652a47add98\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.979238 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-swiftconf\") pod \"86aa3233-17b3-47f4-9c82-e652a47add98\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.979325 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/86aa3233-17b3-47f4-9c82-e652a47add98-etc-swift\") pod \"86aa3233-17b3-47f4-9c82-e652a47add98\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.979408 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-scripts\") pod \"86aa3233-17b3-47f4-9c82-e652a47add98\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.979474 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-dispersionconf\") pod \"86aa3233-17b3-47f4-9c82-e652a47add98\" (UID: \"86aa3233-17b3-47f4-9c82-e652a47add98\") " Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.980107 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "86aa3233-17b3-47f4-9c82-e652a47add98" (UID: "86aa3233-17b3-47f4-9c82-e652a47add98"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.980347 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86aa3233-17b3-47f4-9c82-e652a47add98-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "86aa3233-17b3-47f4-9c82-e652a47add98" (UID: "86aa3233-17b3-47f4-9c82-e652a47add98"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:15 crc kubenswrapper[4831]: I0309 16:34:15.985769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86aa3233-17b3-47f4-9c82-e652a47add98-kube-api-access-k6882" (OuterVolumeSpecName: "kube-api-access-k6882") pod "86aa3233-17b3-47f4-9c82-e652a47add98" (UID: "86aa3233-17b3-47f4-9c82-e652a47add98"). InnerVolumeSpecName "kube-api-access-k6882". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.002649 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "86aa3233-17b3-47f4-9c82-e652a47add98" (UID: "86aa3233-17b3-47f4-9c82-e652a47add98"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.002889 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-scripts" (OuterVolumeSpecName: "scripts") pod "86aa3233-17b3-47f4-9c82-e652a47add98" (UID: "86aa3233-17b3-47f4-9c82-e652a47add98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.002975 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "86aa3233-17b3-47f4-9c82-e652a47add98" (UID: "86aa3233-17b3-47f4-9c82-e652a47add98"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.081331 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.081425 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.081445 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/86aa3233-17b3-47f4-9c82-e652a47add98-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.081463 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6882\" (UniqueName: \"kubernetes.io/projected/86aa3233-17b3-47f4-9c82-e652a47add98-kube-api-access-k6882\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.081475 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/86aa3233-17b3-47f4-9c82-e652a47add98-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.081488 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/86aa3233-17b3-47f4-9c82-e652a47add98-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.630322 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd3f40473f2e8ee45b3f8ac7e4e83b0a2536b362e13a245284bf90bc76a6b7cb" Mar 09 16:34:16 crc kubenswrapper[4831]: I0309 16:34:16.630376 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-scckp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.053865 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp"] Mar 09 16:34:17 crc kubenswrapper[4831]: E0309 16:34:17.054137 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86aa3233-17b3-47f4-9c82-e652a47add98" containerName="swift-ring-rebalance" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.054149 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="86aa3233-17b3-47f4-9c82-e652a47add98" containerName="swift-ring-rebalance" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.054305 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="86aa3233-17b3-47f4-9c82-e652a47add98" containerName="swift-ring-rebalance" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.054805 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.057368 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.058173 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.067927 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp"] Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.098793 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-swiftconf\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.099126 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gng8n\" (UniqueName: \"kubernetes.io/projected/2a085f0d-4772-4370-ac93-ea0b081075bc-kube-api-access-gng8n\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.099230 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.099317 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-dispersionconf\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.099421 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a085f0d-4772-4370-ac93-ea0b081075bc-etc-swift\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.099549 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-scripts\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.200978 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gng8n\" (UniqueName: \"kubernetes.io/projected/2a085f0d-4772-4370-ac93-ea0b081075bc-kube-api-access-gng8n\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.201279 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.201348 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-dispersionconf\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.201455 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a085f0d-4772-4370-ac93-ea0b081075bc-etc-swift\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.201627 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-scripts\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.201697 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-swiftconf\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.202516 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.202917 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a085f0d-4772-4370-ac93-ea0b081075bc-etc-swift\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.203630 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-scripts\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.208793 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-dispersionconf\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.209253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-swiftconf\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.231997 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gng8n\" (UniqueName: \"kubernetes.io/projected/2a085f0d-4772-4370-ac93-ea0b081075bc-kube-api-access-gng8n\") pod \"swift-ring-rebalance-debug-fxhjp\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.372805 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.630006 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86aa3233-17b3-47f4-9c82-e652a47add98" path="/var/lib/kubelet/pods/86aa3233-17b3-47f4-9c82-e652a47add98/volumes" Mar 09 16:34:17 crc kubenswrapper[4831]: I0309 16:34:17.890552 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp"] Mar 09 16:34:18 crc kubenswrapper[4831]: I0309 16:34:18.652171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" event={"ID":"2a085f0d-4772-4370-ac93-ea0b081075bc","Type":"ContainerStarted","Data":"9e6678ae4afba2f33596294ae3b315589e9711f22ddbde96c9bb98924176175e"} Mar 09 16:34:18 crc kubenswrapper[4831]: I0309 16:34:18.652665 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" event={"ID":"2a085f0d-4772-4370-ac93-ea0b081075bc","Type":"ContainerStarted","Data":"49965aadf29802df2f27a4f75378c92956f43819c26736d4c3b11b1f489e18f3"} Mar 09 16:34:18 crc kubenswrapper[4831]: I0309 16:34:18.675325 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" podStartSLOduration=1.6753039589999998 podStartE2EDuration="1.675303959s" podCreationTimestamp="2026-03-09 16:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:18.671574243 +0000 UTC m=+2185.805256676" watchObservedRunningTime="2026-03-09 16:34:18.675303959 +0000 UTC m=+2185.808986382" Mar 09 16:34:19 crc kubenswrapper[4831]: I0309 16:34:19.663140 4831 generic.go:334] "Generic (PLEG): container finished" podID="2a085f0d-4772-4370-ac93-ea0b081075bc" containerID="9e6678ae4afba2f33596294ae3b315589e9711f22ddbde96c9bb98924176175e" 
exitCode=0 Mar 09 16:34:19 crc kubenswrapper[4831]: I0309 16:34:19.663231 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" event={"ID":"2a085f0d-4772-4370-ac93-ea0b081075bc","Type":"ContainerDied","Data":"9e6678ae4afba2f33596294ae3b315589e9711f22ddbde96c9bb98924176175e"} Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.017336 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.043980 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp"] Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.060213 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp"] Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.077946 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gng8n\" (UniqueName: \"kubernetes.io/projected/2a085f0d-4772-4370-ac93-ea0b081075bc-kube-api-access-gng8n\") pod \"2a085f0d-4772-4370-ac93-ea0b081075bc\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.078032 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-ring-data-devices\") pod \"2a085f0d-4772-4370-ac93-ea0b081075bc\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.078116 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a085f0d-4772-4370-ac93-ea0b081075bc-etc-swift\") pod \"2a085f0d-4772-4370-ac93-ea0b081075bc\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " Mar 09 16:34:21 crc 
kubenswrapper[4831]: I0309 16:34:21.078147 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-dispersionconf\") pod \"2a085f0d-4772-4370-ac93-ea0b081075bc\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.078185 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-scripts\") pod \"2a085f0d-4772-4370-ac93-ea0b081075bc\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.078232 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-swiftconf\") pod \"2a085f0d-4772-4370-ac93-ea0b081075bc\" (UID: \"2a085f0d-4772-4370-ac93-ea0b081075bc\") " Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.078623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2a085f0d-4772-4370-ac93-ea0b081075bc" (UID: "2a085f0d-4772-4370-ac93-ea0b081075bc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.079282 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a085f0d-4772-4370-ac93-ea0b081075bc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2a085f0d-4772-4370-ac93-ea0b081075bc" (UID: "2a085f0d-4772-4370-ac93-ea0b081075bc"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.083769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a085f0d-4772-4370-ac93-ea0b081075bc-kube-api-access-gng8n" (OuterVolumeSpecName: "kube-api-access-gng8n") pod "2a085f0d-4772-4370-ac93-ea0b081075bc" (UID: "2a085f0d-4772-4370-ac93-ea0b081075bc"). InnerVolumeSpecName "kube-api-access-gng8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.100176 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2a085f0d-4772-4370-ac93-ea0b081075bc" (UID: "2a085f0d-4772-4370-ac93-ea0b081075bc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.105584 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-scripts" (OuterVolumeSpecName: "scripts") pod "2a085f0d-4772-4370-ac93-ea0b081075bc" (UID: "2a085f0d-4772-4370-ac93-ea0b081075bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.107658 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2a085f0d-4772-4370-ac93-ea0b081075bc" (UID: "2a085f0d-4772-4370-ac93-ea0b081075bc"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.179297 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a085f0d-4772-4370-ac93-ea0b081075bc-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.179333 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.179345 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.179353 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a085f0d-4772-4370-ac93-ea0b081075bc-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.179362 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gng8n\" (UniqueName: \"kubernetes.io/projected/2a085f0d-4772-4370-ac93-ea0b081075bc-kube-api-access-gng8n\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.179372 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a085f0d-4772-4370-ac93-ea0b081075bc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.628021 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a085f0d-4772-4370-ac93-ea0b081075bc" path="/var/lib/kubelet/pods/2a085f0d-4772-4370-ac93-ea0b081075bc/volumes" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.703538 4831 scope.go:117] "RemoveContainer" 
containerID="9e6678ae4afba2f33596294ae3b315589e9711f22ddbde96c9bb98924176175e" Mar 09 16:34:21 crc kubenswrapper[4831]: I0309 16:34:21.703672 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fxhjp" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.178601 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t7frv"] Mar 09 16:34:22 crc kubenswrapper[4831]: E0309 16:34:22.179257 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a085f0d-4772-4370-ac93-ea0b081075bc" containerName="swift-ring-rebalance" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.179276 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a085f0d-4772-4370-ac93-ea0b081075bc" containerName="swift-ring-rebalance" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.179503 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a085f0d-4772-4370-ac93-ea0b081075bc" containerName="swift-ring-rebalance" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.180170 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.181833 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.183263 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.187434 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t7frv"] Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.200531 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkztw\" (UniqueName: \"kubernetes.io/projected/4f24ad64-ae88-49d7-96cf-4899f3af0924-kube-api-access-zkztw\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.200620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-ring-data-devices\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.200680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-scripts\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.200721 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-swiftconf\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.200750 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f24ad64-ae88-49d7-96cf-4899f3af0924-etc-swift\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.200789 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-dispersionconf\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.301927 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-scripts\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.302002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-swiftconf\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.302029 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f24ad64-ae88-49d7-96cf-4899f3af0924-etc-swift\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.302065 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-dispersionconf\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.302110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkztw\" (UniqueName: \"kubernetes.io/projected/4f24ad64-ae88-49d7-96cf-4899f3af0924-kube-api-access-zkztw\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.302161 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-ring-data-devices\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.302688 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f24ad64-ae88-49d7-96cf-4899f3af0924-etc-swift\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 
16:34:22.303630 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-ring-data-devices\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.303663 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-scripts\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.308229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-dispersionconf\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.316088 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-swiftconf\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.335089 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkztw\" (UniqueName: \"kubernetes.io/projected/4f24ad64-ae88-49d7-96cf-4899f3af0924-kube-api-access-zkztw\") pod \"swift-ring-rebalance-debug-t7frv\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.502514 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:22 crc kubenswrapper[4831]: I0309 16:34:22.768449 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t7frv"] Mar 09 16:34:23 crc kubenswrapper[4831]: I0309 16:34:23.729468 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" event={"ID":"4f24ad64-ae88-49d7-96cf-4899f3af0924","Type":"ContainerStarted","Data":"dea59737a72a5cdd827cedd454cc822a00decbad58254f4e770b96a46b47c61e"} Mar 09 16:34:23 crc kubenswrapper[4831]: I0309 16:34:23.729529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" event={"ID":"4f24ad64-ae88-49d7-96cf-4899f3af0924","Type":"ContainerStarted","Data":"588475eee073ccbcaffc9db6d36e3c751bcdcb6ea17ce99d6914f40ccffc64a4"} Mar 09 16:34:23 crc kubenswrapper[4831]: I0309 16:34:23.765137 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" podStartSLOduration=1.765092763 podStartE2EDuration="1.765092763s" podCreationTimestamp="2026-03-09 16:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:23.750826028 +0000 UTC m=+2190.884508461" watchObservedRunningTime="2026-03-09 16:34:23.765092763 +0000 UTC m=+2190.898775196" Mar 09 16:34:24 crc kubenswrapper[4831]: I0309 16:34:24.744682 4831 generic.go:334] "Generic (PLEG): container finished" podID="4f24ad64-ae88-49d7-96cf-4899f3af0924" containerID="dea59737a72a5cdd827cedd454cc822a00decbad58254f4e770b96a46b47c61e" exitCode=0 Mar 09 16:34:24 crc kubenswrapper[4831]: I0309 16:34:24.744754 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" 
event={"ID":"4f24ad64-ae88-49d7-96cf-4899f3af0924","Type":"ContainerDied","Data":"dea59737a72a5cdd827cedd454cc822a00decbad58254f4e770b96a46b47c61e"} Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.069996 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.120088 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t7frv"] Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.131008 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t7frv"] Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.175925 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-dispersionconf\") pod \"4f24ad64-ae88-49d7-96cf-4899f3af0924\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.176003 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkztw\" (UniqueName: \"kubernetes.io/projected/4f24ad64-ae88-49d7-96cf-4899f3af0924-kube-api-access-zkztw\") pod \"4f24ad64-ae88-49d7-96cf-4899f3af0924\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.176069 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-scripts\") pod \"4f24ad64-ae88-49d7-96cf-4899f3af0924\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.176130 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-ring-data-devices\") pod \"4f24ad64-ae88-49d7-96cf-4899f3af0924\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.176221 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f24ad64-ae88-49d7-96cf-4899f3af0924-etc-swift\") pod \"4f24ad64-ae88-49d7-96cf-4899f3af0924\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.176298 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-swiftconf\") pod \"4f24ad64-ae88-49d7-96cf-4899f3af0924\" (UID: \"4f24ad64-ae88-49d7-96cf-4899f3af0924\") " Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.177600 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4f24ad64-ae88-49d7-96cf-4899f3af0924" (UID: "4f24ad64-ae88-49d7-96cf-4899f3af0924"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.184672 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f24ad64-ae88-49d7-96cf-4899f3af0924-kube-api-access-zkztw" (OuterVolumeSpecName: "kube-api-access-zkztw") pod "4f24ad64-ae88-49d7-96cf-4899f3af0924" (UID: "4f24ad64-ae88-49d7-96cf-4899f3af0924"). InnerVolumeSpecName "kube-api-access-zkztw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.193634 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f24ad64-ae88-49d7-96cf-4899f3af0924-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4f24ad64-ae88-49d7-96cf-4899f3af0924" (UID: "4f24ad64-ae88-49d7-96cf-4899f3af0924"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.244129 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4f24ad64-ae88-49d7-96cf-4899f3af0924" (UID: "4f24ad64-ae88-49d7-96cf-4899f3af0924"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.249301 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4f24ad64-ae88-49d7-96cf-4899f3af0924" (UID: "4f24ad64-ae88-49d7-96cf-4899f3af0924"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.260712 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-scripts" (OuterVolumeSpecName: "scripts") pod "4f24ad64-ae88-49d7-96cf-4899f3af0924" (UID: "4f24ad64-ae88-49d7-96cf-4899f3af0924"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.277726 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.277769 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f24ad64-ae88-49d7-96cf-4899f3af0924-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.277786 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkztw\" (UniqueName: \"kubernetes.io/projected/4f24ad64-ae88-49d7-96cf-4899f3af0924-kube-api-access-zkztw\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.277796 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.277805 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f24ad64-ae88-49d7-96cf-4899f3af0924-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.277815 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f24ad64-ae88-49d7-96cf-4899f3af0924-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.766855 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="588475eee073ccbcaffc9db6d36e3c751bcdcb6ea17ce99d6914f40ccffc64a4" Mar 09 16:34:26 crc kubenswrapper[4831]: I0309 16:34:26.766942 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t7frv" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.264032 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4"] Mar 09 16:34:27 crc kubenswrapper[4831]: E0309 16:34:27.264435 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f24ad64-ae88-49d7-96cf-4899f3af0924" containerName="swift-ring-rebalance" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.264453 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f24ad64-ae88-49d7-96cf-4899f3af0924" containerName="swift-ring-rebalance" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.264643 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f24ad64-ae88-49d7-96cf-4899f3af0924" containerName="swift-ring-rebalance" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.265319 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.268150 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.271962 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.281478 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4"] Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.393160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-ring-data-devices\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.393295 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-scripts\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.393329 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxwz\" (UniqueName: \"kubernetes.io/projected/9d6163d8-1085-4d48-b795-cc14d16ce38d-kube-api-access-scxwz\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.393364 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-swiftconf\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.393489 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-dispersionconf\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.393525 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/9d6163d8-1085-4d48-b795-cc14d16ce38d-etc-swift\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.495219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-scripts\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.495292 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scxwz\" (UniqueName: \"kubernetes.io/projected/9d6163d8-1085-4d48-b795-cc14d16ce38d-kube-api-access-scxwz\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.495326 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-swiftconf\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.495374 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-dispersionconf\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.495420 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/9d6163d8-1085-4d48-b795-cc14d16ce38d-etc-swift\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.495455 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-ring-data-devices\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.495942 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-scripts\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.496246 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-ring-data-devices\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.496530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d6163d8-1085-4d48-b795-cc14d16ce38d-etc-swift\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.501000 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-swiftconf\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.504469 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-dispersionconf\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.513047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxwz\" (UniqueName: \"kubernetes.io/projected/9d6163d8-1085-4d48-b795-cc14d16ce38d-kube-api-access-scxwz\") pod \"swift-ring-rebalance-debug-dx4k4\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.587159 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:27 crc kubenswrapper[4831]: I0309 16:34:27.632052 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f24ad64-ae88-49d7-96cf-4899f3af0924" path="/var/lib/kubelet/pods/4f24ad64-ae88-49d7-96cf-4899f3af0924/volumes" Mar 09 16:34:28 crc kubenswrapper[4831]: I0309 16:34:28.010316 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4"] Mar 09 16:34:28 crc kubenswrapper[4831]: I0309 16:34:28.786834 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" event={"ID":"9d6163d8-1085-4d48-b795-cc14d16ce38d","Type":"ContainerStarted","Data":"03c33f870b7f9c7a1fad1eee658ac4d222a3b765cd0fb7109645b6111e4b413b"} Mar 09 16:34:28 crc kubenswrapper[4831]: I0309 16:34:28.787267 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" event={"ID":"9d6163d8-1085-4d48-b795-cc14d16ce38d","Type":"ContainerStarted","Data":"93f5b2af2c1971ba1153d717cbf1447fc25568b9d4980f6eb3bd459af6934874"} Mar 09 16:34:28 crc kubenswrapper[4831]: I0309 16:34:28.804166 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" podStartSLOduration=1.804141276 podStartE2EDuration="1.804141276s" podCreationTimestamp="2026-03-09 16:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:28.801931814 +0000 UTC m=+2195.935614237" watchObservedRunningTime="2026-03-09 16:34:28.804141276 +0000 UTC m=+2195.937823709" Mar 09 16:34:29 crc kubenswrapper[4831]: I0309 16:34:29.812158 4831 generic.go:334] "Generic (PLEG): container finished" podID="9d6163d8-1085-4d48-b795-cc14d16ce38d" containerID="03c33f870b7f9c7a1fad1eee658ac4d222a3b765cd0fb7109645b6111e4b413b" exitCode=0 
Mar 09 16:34:29 crc kubenswrapper[4831]: I0309 16:34:29.812353 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" event={"ID":"9d6163d8-1085-4d48-b795-cc14d16ce38d","Type":"ContainerDied","Data":"03c33f870b7f9c7a1fad1eee658ac4d222a3b765cd0fb7109645b6111e4b413b"} Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.123179 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.154988 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-scripts\") pod \"9d6163d8-1085-4d48-b795-cc14d16ce38d\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.155055 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-swiftconf\") pod \"9d6163d8-1085-4d48-b795-cc14d16ce38d\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.155182 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scxwz\" (UniqueName: \"kubernetes.io/projected/9d6163d8-1085-4d48-b795-cc14d16ce38d-kube-api-access-scxwz\") pod \"9d6163d8-1085-4d48-b795-cc14d16ce38d\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.155212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-dispersionconf\") pod \"9d6163d8-1085-4d48-b795-cc14d16ce38d\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.155250 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-ring-data-devices\") pod \"9d6163d8-1085-4d48-b795-cc14d16ce38d\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.155273 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d6163d8-1085-4d48-b795-cc14d16ce38d-etc-swift\") pod \"9d6163d8-1085-4d48-b795-cc14d16ce38d\" (UID: \"9d6163d8-1085-4d48-b795-cc14d16ce38d\") " Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.156423 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9d6163d8-1085-4d48-b795-cc14d16ce38d" (UID: "9d6163d8-1085-4d48-b795-cc14d16ce38d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.156457 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d6163d8-1085-4d48-b795-cc14d16ce38d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9d6163d8-1085-4d48-b795-cc14d16ce38d" (UID: "9d6163d8-1085-4d48-b795-cc14d16ce38d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.182633 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6163d8-1085-4d48-b795-cc14d16ce38d-kube-api-access-scxwz" (OuterVolumeSpecName: "kube-api-access-scxwz") pod "9d6163d8-1085-4d48-b795-cc14d16ce38d" (UID: "9d6163d8-1085-4d48-b795-cc14d16ce38d"). InnerVolumeSpecName "kube-api-access-scxwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.195077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-scripts" (OuterVolumeSpecName: "scripts") pod "9d6163d8-1085-4d48-b795-cc14d16ce38d" (UID: "9d6163d8-1085-4d48-b795-cc14d16ce38d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.200491 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9d6163d8-1085-4d48-b795-cc14d16ce38d" (UID: "9d6163d8-1085-4d48-b795-cc14d16ce38d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.249459 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4"] Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.255844 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9d6163d8-1085-4d48-b795-cc14d16ce38d" (UID: "9d6163d8-1085-4d48-b795-cc14d16ce38d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.257415 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scxwz\" (UniqueName: \"kubernetes.io/projected/9d6163d8-1085-4d48-b795-cc14d16ce38d-kube-api-access-scxwz\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.257486 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.257518 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.257532 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d6163d8-1085-4d48-b795-cc14d16ce38d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.257547 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6163d8-1085-4d48-b795-cc14d16ce38d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.257559 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d6163d8-1085-4d48-b795-cc14d16ce38d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.283871 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4"] Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.628368 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6163d8-1085-4d48-b795-cc14d16ce38d" 
path="/var/lib/kubelet/pods/9d6163d8-1085-4d48-b795-cc14d16ce38d/volumes" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.832280 4831 scope.go:117] "RemoveContainer" containerID="03c33f870b7f9c7a1fad1eee658ac4d222a3b765cd0fb7109645b6111e4b413b" Mar 09 16:34:31 crc kubenswrapper[4831]: I0309 16:34:31.832373 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dx4k4" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.324565 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-66bcl"] Mar 09 16:34:32 crc kubenswrapper[4831]: E0309 16:34:32.324837 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6163d8-1085-4d48-b795-cc14d16ce38d" containerName="swift-ring-rebalance" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.324850 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6163d8-1085-4d48-b795-cc14d16ce38d" containerName="swift-ring-rebalance" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.325015 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d6163d8-1085-4d48-b795-cc14d16ce38d" containerName="swift-ring-rebalance" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.325511 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.328110 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.330194 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.344224 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-66bcl"] Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.377080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-swiftconf\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.377181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-scripts\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.377210 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-dispersionconf\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.377236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-ring-data-devices\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.377270 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-etc-swift\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.377305 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxhs\" (UniqueName: \"kubernetes.io/projected/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-kube-api-access-btxhs\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.478487 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-scripts\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.478532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-dispersionconf\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.478560 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-ring-data-devices\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.478595 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-etc-swift\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.478618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btxhs\" (UniqueName: \"kubernetes.io/projected/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-kube-api-access-btxhs\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.478648 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-swiftconf\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.479360 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-etc-swift\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.479387 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-scripts\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.479417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-ring-data-devices\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.482736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-swiftconf\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.482880 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-dispersionconf\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.497312 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btxhs\" (UniqueName: \"kubernetes.io/projected/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-kube-api-access-btxhs\") pod \"swift-ring-rebalance-debug-66bcl\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:32 crc kubenswrapper[4831]: I0309 16:34:32.650771 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:33 crc kubenswrapper[4831]: I0309 16:34:33.079626 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-66bcl"] Mar 09 16:34:33 crc kubenswrapper[4831]: W0309 16:34:33.084004 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dc5bb0d_0901_4183_81fd_98f86c96b6b2.slice/crio-2e25752ff28d0efa2915ea7b610008d270d3133e0698320dc7e083823fc7335c WatchSource:0}: Error finding container 2e25752ff28d0efa2915ea7b610008d270d3133e0698320dc7e083823fc7335c: Status 404 returned error can't find the container with id 2e25752ff28d0efa2915ea7b610008d270d3133e0698320dc7e083823fc7335c Mar 09 16:34:33 crc kubenswrapper[4831]: I0309 16:34:33.857389 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" event={"ID":"4dc5bb0d-0901-4183-81fd-98f86c96b6b2","Type":"ContainerStarted","Data":"e27105abccfd9a9b453ab5054fb08143b5cfdecb74f65f67940d1b4269481aa9"} Mar 09 16:34:33 crc kubenswrapper[4831]: I0309 16:34:33.858000 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" event={"ID":"4dc5bb0d-0901-4183-81fd-98f86c96b6b2","Type":"ContainerStarted","Data":"2e25752ff28d0efa2915ea7b610008d270d3133e0698320dc7e083823fc7335c"} Mar 09 16:34:33 crc kubenswrapper[4831]: I0309 16:34:33.889518 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" podStartSLOduration=1.889488304 podStartE2EDuration="1.889488304s" podCreationTimestamp="2026-03-09 16:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:33.876857426 +0000 UTC m=+2201.010539849" 
watchObservedRunningTime="2026-03-09 16:34:33.889488304 +0000 UTC m=+2201.023170727" Mar 09 16:34:34 crc kubenswrapper[4831]: I0309 16:34:34.867933 4831 generic.go:334] "Generic (PLEG): container finished" podID="4dc5bb0d-0901-4183-81fd-98f86c96b6b2" containerID="e27105abccfd9a9b453ab5054fb08143b5cfdecb74f65f67940d1b4269481aa9" exitCode=0 Mar 09 16:34:34 crc kubenswrapper[4831]: I0309 16:34:34.868026 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" event={"ID":"4dc5bb0d-0901-4183-81fd-98f86c96b6b2","Type":"ContainerDied","Data":"e27105abccfd9a9b453ab5054fb08143b5cfdecb74f65f67940d1b4269481aa9"} Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.175980 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.217351 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-66bcl"] Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.228282 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-66bcl"] Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.237782 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-swiftconf\") pod \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.238042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-ring-data-devices\") pod \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.238089 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-scripts\") pod \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.238177 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-etc-swift\") pod \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.238197 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-dispersionconf\") pod \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.238267 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btxhs\" (UniqueName: \"kubernetes.io/projected/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-kube-api-access-btxhs\") pod \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\" (UID: \"4dc5bb0d-0901-4183-81fd-98f86c96b6b2\") " Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.239764 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4dc5bb0d-0901-4183-81fd-98f86c96b6b2" (UID: "4dc5bb0d-0901-4183-81fd-98f86c96b6b2"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.240001 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4dc5bb0d-0901-4183-81fd-98f86c96b6b2" (UID: "4dc5bb0d-0901-4183-81fd-98f86c96b6b2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.262695 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-scripts" (OuterVolumeSpecName: "scripts") pod "4dc5bb0d-0901-4183-81fd-98f86c96b6b2" (UID: "4dc5bb0d-0901-4183-81fd-98f86c96b6b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.264765 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-kube-api-access-btxhs" (OuterVolumeSpecName: "kube-api-access-btxhs") pod "4dc5bb0d-0901-4183-81fd-98f86c96b6b2" (UID: "4dc5bb0d-0901-4183-81fd-98f86c96b6b2"). InnerVolumeSpecName "kube-api-access-btxhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.290021 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4dc5bb0d-0901-4183-81fd-98f86c96b6b2" (UID: "4dc5bb0d-0901-4183-81fd-98f86c96b6b2"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.293532 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4dc5bb0d-0901-4183-81fd-98f86c96b6b2" (UID: "4dc5bb0d-0901-4183-81fd-98f86c96b6b2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.340432 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.340487 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btxhs\" (UniqueName: \"kubernetes.io/projected/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-kube-api-access-btxhs\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.340502 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.340514 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.340526 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.340537 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/4dc5bb0d-0901-4183-81fd-98f86c96b6b2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.886561 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e25752ff28d0efa2915ea7b610008d270d3133e0698320dc7e083823fc7335c" Mar 09 16:34:36 crc kubenswrapper[4831]: I0309 16:34:36.886689 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-66bcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.359123 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl"] Mar 09 16:34:37 crc kubenswrapper[4831]: E0309 16:34:37.359729 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc5bb0d-0901-4183-81fd-98f86c96b6b2" containerName="swift-ring-rebalance" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.359744 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc5bb0d-0901-4183-81fd-98f86c96b6b2" containerName="swift-ring-rebalance" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.359913 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc5bb0d-0901-4183-81fd-98f86c96b6b2" containerName="swift-ring-rebalance" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.360394 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.362557 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.363145 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.369612 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl"] Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.457128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-swiftconf\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.457230 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a53e7d20-9521-4d8e-9960-ded30870eb0c-etc-swift\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.457254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-dispersionconf\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.457283 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pjq7s\" (UniqueName: \"kubernetes.io/projected/a53e7d20-9521-4d8e-9960-ded30870eb0c-kube-api-access-pjq7s\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.457458 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-ring-data-devices\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.457486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-scripts\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.558697 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-swiftconf\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.558744 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a53e7d20-9521-4d8e-9960-ded30870eb0c-etc-swift\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.558762 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-dispersionconf\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.558780 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjq7s\" (UniqueName: \"kubernetes.io/projected/a53e7d20-9521-4d8e-9960-ded30870eb0c-kube-api-access-pjq7s\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.558896 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-ring-data-devices\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.558928 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-scripts\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.559271 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a53e7d20-9521-4d8e-9960-ded30870eb0c-etc-swift\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 
16:34:37.559671 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-scripts\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.559910 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-ring-data-devices\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.562124 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-swiftconf\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.563977 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-dispersionconf\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.578266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjq7s\" (UniqueName: \"kubernetes.io/projected/a53e7d20-9521-4d8e-9960-ded30870eb0c-kube-api-access-pjq7s\") pod \"swift-ring-rebalance-debug-zlmcl\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.650014 4831 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc5bb0d-0901-4183-81fd-98f86c96b6b2" path="/var/lib/kubelet/pods/4dc5bb0d-0901-4183-81fd-98f86c96b6b2/volumes" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.677691 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:37 crc kubenswrapper[4831]: I0309 16:34:37.891343 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl"] Mar 09 16:34:37 crc kubenswrapper[4831]: W0309 16:34:37.899992 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda53e7d20_9521_4d8e_9960_ded30870eb0c.slice/crio-fd47c44330b6bc0446fd1015ea7e14e6fe3d51033a0613f41f429ff621ef5815 WatchSource:0}: Error finding container fd47c44330b6bc0446fd1015ea7e14e6fe3d51033a0613f41f429ff621ef5815: Status 404 returned error can't find the container with id fd47c44330b6bc0446fd1015ea7e14e6fe3d51033a0613f41f429ff621ef5815 Mar 09 16:34:38 crc kubenswrapper[4831]: I0309 16:34:38.911272 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" event={"ID":"a53e7d20-9521-4d8e-9960-ded30870eb0c","Type":"ContainerStarted","Data":"9a3c3a4b30d95c7f5e63df132fcf287195dc1354c74509bd58d4d8a342f17d7f"} Mar 09 16:34:38 crc kubenswrapper[4831]: I0309 16:34:38.911644 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" event={"ID":"a53e7d20-9521-4d8e-9960-ded30870eb0c","Type":"ContainerStarted","Data":"fd47c44330b6bc0446fd1015ea7e14e6fe3d51033a0613f41f429ff621ef5815"} Mar 09 16:34:38 crc kubenswrapper[4831]: I0309 16:34:38.932862 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" podStartSLOduration=1.932831809 
podStartE2EDuration="1.932831809s" podCreationTimestamp="2026-03-09 16:34:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:38.926778387 +0000 UTC m=+2206.060460810" watchObservedRunningTime="2026-03-09 16:34:38.932831809 +0000 UTC m=+2206.066514232" Mar 09 16:34:39 crc kubenswrapper[4831]: I0309 16:34:39.923988 4831 generic.go:334] "Generic (PLEG): container finished" podID="a53e7d20-9521-4d8e-9960-ded30870eb0c" containerID="9a3c3a4b30d95c7f5e63df132fcf287195dc1354c74509bd58d4d8a342f17d7f" exitCode=0 Mar 09 16:34:39 crc kubenswrapper[4831]: I0309 16:34:39.924134 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" event={"ID":"a53e7d20-9521-4d8e-9960-ded30870eb0c","Type":"ContainerDied","Data":"9a3c3a4b30d95c7f5e63df132fcf287195dc1354c74509bd58d4d8a342f17d7f"} Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.272425 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.305263 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl"] Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.311706 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl"] Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.416881 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-dispersionconf\") pod \"a53e7d20-9521-4d8e-9960-ded30870eb0c\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.416988 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-scripts\") pod \"a53e7d20-9521-4d8e-9960-ded30870eb0c\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.417091 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-swiftconf\") pod \"a53e7d20-9521-4d8e-9960-ded30870eb0c\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.417142 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjq7s\" (UniqueName: \"kubernetes.io/projected/a53e7d20-9521-4d8e-9960-ded30870eb0c-kube-api-access-pjq7s\") pod \"a53e7d20-9521-4d8e-9960-ded30870eb0c\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.417173 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/a53e7d20-9521-4d8e-9960-ded30870eb0c-etc-swift\") pod \"a53e7d20-9521-4d8e-9960-ded30870eb0c\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.417230 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-ring-data-devices\") pod \"a53e7d20-9521-4d8e-9960-ded30870eb0c\" (UID: \"a53e7d20-9521-4d8e-9960-ded30870eb0c\") " Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.418340 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a53e7d20-9521-4d8e-9960-ded30870eb0c" (UID: "a53e7d20-9521-4d8e-9960-ded30870eb0c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.418426 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a53e7d20-9521-4d8e-9960-ded30870eb0c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a53e7d20-9521-4d8e-9960-ded30870eb0c" (UID: "a53e7d20-9521-4d8e-9960-ded30870eb0c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.424223 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a53e7d20-9521-4d8e-9960-ded30870eb0c-kube-api-access-pjq7s" (OuterVolumeSpecName: "kube-api-access-pjq7s") pod "a53e7d20-9521-4d8e-9960-ded30870eb0c" (UID: "a53e7d20-9521-4d8e-9960-ded30870eb0c"). InnerVolumeSpecName "kube-api-access-pjq7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.439873 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-scripts" (OuterVolumeSpecName: "scripts") pod "a53e7d20-9521-4d8e-9960-ded30870eb0c" (UID: "a53e7d20-9521-4d8e-9960-ded30870eb0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.444253 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a53e7d20-9521-4d8e-9960-ded30870eb0c" (UID: "a53e7d20-9521-4d8e-9960-ded30870eb0c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.450158 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a53e7d20-9521-4d8e-9960-ded30870eb0c" (UID: "a53e7d20-9521-4d8e-9960-ded30870eb0c"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.518572 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.519140 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjq7s\" (UniqueName: \"kubernetes.io/projected/a53e7d20-9521-4d8e-9960-ded30870eb0c-kube-api-access-pjq7s\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.519243 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a53e7d20-9521-4d8e-9960-ded30870eb0c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.519304 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.519357 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a53e7d20-9521-4d8e-9960-ded30870eb0c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.519443 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e7d20-9521-4d8e-9960-ded30870eb0c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.627830 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a53e7d20-9521-4d8e-9960-ded30870eb0c" path="/var/lib/kubelet/pods/a53e7d20-9521-4d8e-9960-ded30870eb0c/volumes" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.945441 4831 scope.go:117] "RemoveContainer" 
containerID="9a3c3a4b30d95c7f5e63df132fcf287195dc1354c74509bd58d4d8a342f17d7f" Mar 09 16:34:41 crc kubenswrapper[4831]: I0309 16:34:41.945474 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlmcl" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.442711 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dk52k"] Mar 09 16:34:42 crc kubenswrapper[4831]: E0309 16:34:42.444010 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e7d20-9521-4d8e-9960-ded30870eb0c" containerName="swift-ring-rebalance" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.444094 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53e7d20-9521-4d8e-9960-ded30870eb0c" containerName="swift-ring-rebalance" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.444346 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53e7d20-9521-4d8e-9960-ded30870eb0c" containerName="swift-ring-rebalance" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.445044 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.447762 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.447762 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.466643 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dk52k"] Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.639183 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-574pf\" (UniqueName: \"kubernetes.io/projected/a3838b46-7315-4dae-839a-ca906573228c-kube-api-access-574pf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.639255 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3838b46-7315-4dae-839a-ca906573228c-etc-swift\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.639308 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-swiftconf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.639338 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-scripts\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.639361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-dispersionconf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.639389 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-ring-data-devices\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.740915 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-swiftconf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.741002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-scripts\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 
16:34:42.741030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-dispersionconf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.741062 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-ring-data-devices\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.741210 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-574pf\" (UniqueName: \"kubernetes.io/projected/a3838b46-7315-4dae-839a-ca906573228c-kube-api-access-574pf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.741286 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3838b46-7315-4dae-839a-ca906573228c-etc-swift\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.742132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-scripts\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc 
kubenswrapper[4831]: I0309 16:34:42.742175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3838b46-7315-4dae-839a-ca906573228c-etc-swift\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.742699 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-ring-data-devices\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.745775 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-dispersionconf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.746335 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-swiftconf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: I0309 16:34:42.760106 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-574pf\" (UniqueName: \"kubernetes.io/projected/a3838b46-7315-4dae-839a-ca906573228c-kube-api-access-574pf\") pod \"swift-ring-rebalance-debug-dk52k\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:42 crc kubenswrapper[4831]: 
I0309 16:34:42.764081 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:43 crc kubenswrapper[4831]: I0309 16:34:43.210115 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dk52k"] Mar 09 16:34:43 crc kubenswrapper[4831]: I0309 16:34:43.971955 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" event={"ID":"a3838b46-7315-4dae-839a-ca906573228c","Type":"ContainerStarted","Data":"060da24d46167f525d6cf4a46a136ede24c510fc94ae7f26c9e3957f14998246"} Mar 09 16:34:43 crc kubenswrapper[4831]: I0309 16:34:43.972429 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" event={"ID":"a3838b46-7315-4dae-839a-ca906573228c","Type":"ContainerStarted","Data":"8810e982d1dc3ab30ce6ea2c38c964d15516a2c2450e63c3c7afa2dd8056201f"} Mar 09 16:34:44 crc kubenswrapper[4831]: I0309 16:34:44.984382 4831 generic.go:334] "Generic (PLEG): container finished" podID="a3838b46-7315-4dae-839a-ca906573228c" containerID="060da24d46167f525d6cf4a46a136ede24c510fc94ae7f26c9e3957f14998246" exitCode=0 Mar 09 16:34:44 crc kubenswrapper[4831]: I0309 16:34:44.984497 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" event={"ID":"a3838b46-7315-4dae-839a-ca906573228c","Type":"ContainerDied","Data":"060da24d46167f525d6cf4a46a136ede24c510fc94ae7f26c9e3957f14998246"} Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.279579 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.301964 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3838b46-7315-4dae-839a-ca906573228c-etc-swift\") pod \"a3838b46-7315-4dae-839a-ca906573228c\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.302025 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-574pf\" (UniqueName: \"kubernetes.io/projected/a3838b46-7315-4dae-839a-ca906573228c-kube-api-access-574pf\") pod \"a3838b46-7315-4dae-839a-ca906573228c\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.302114 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-swiftconf\") pod \"a3838b46-7315-4dae-839a-ca906573228c\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.302165 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-dispersionconf\") pod \"a3838b46-7315-4dae-839a-ca906573228c\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.302209 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-ring-data-devices\") pod \"a3838b46-7315-4dae-839a-ca906573228c\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.302247 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-scripts\") pod \"a3838b46-7315-4dae-839a-ca906573228c\" (UID: \"a3838b46-7315-4dae-839a-ca906573228c\") " Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.302778 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a3838b46-7315-4dae-839a-ca906573228c" (UID: "a3838b46-7315-4dae-839a-ca906573228c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.302897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3838b46-7315-4dae-839a-ca906573228c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a3838b46-7315-4dae-839a-ca906573228c" (UID: "a3838b46-7315-4dae-839a-ca906573228c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.315846 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3838b46-7315-4dae-839a-ca906573228c-kube-api-access-574pf" (OuterVolumeSpecName: "kube-api-access-574pf") pod "a3838b46-7315-4dae-839a-ca906573228c" (UID: "a3838b46-7315-4dae-839a-ca906573228c"). InnerVolumeSpecName "kube-api-access-574pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.327203 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dk52k"] Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.331251 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a3838b46-7315-4dae-839a-ca906573228c" (UID: "a3838b46-7315-4dae-839a-ca906573228c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.332568 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-scripts" (OuterVolumeSpecName: "scripts") pod "a3838b46-7315-4dae-839a-ca906573228c" (UID: "a3838b46-7315-4dae-839a-ca906573228c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.335989 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a3838b46-7315-4dae-839a-ca906573228c" (UID: "a3838b46-7315-4dae-839a-ca906573228c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.344593 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dk52k"] Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.403729 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.403815 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.403869 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3838b46-7315-4dae-839a-ca906573228c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.404021 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3838b46-7315-4dae-839a-ca906573228c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.404040 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-574pf\" (UniqueName: \"kubernetes.io/projected/a3838b46-7315-4dae-839a-ca906573228c-kube-api-access-574pf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:46 crc kubenswrapper[4831]: I0309 16:34:46.404096 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3838b46-7315-4dae-839a-ca906573228c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:34:47 crc kubenswrapper[4831]: I0309 16:34:47.002370 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8810e982d1dc3ab30ce6ea2c38c964d15516a2c2450e63c3c7afa2dd8056201f" Mar 09 16:34:47 crc kubenswrapper[4831]: I0309 16:34:47.002484 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dk52k" Mar 09 16:34:47 crc kubenswrapper[4831]: I0309 16:34:47.628338 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3838b46-7315-4dae-839a-ca906573228c" path="/var/lib/kubelet/pods/a3838b46-7315-4dae-839a-ca906573228c/volumes" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.311945 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"] Mar 09 16:34:48 crc kubenswrapper[4831]: E0309 16:34:48.312206 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3838b46-7315-4dae-839a-ca906573228c" containerName="swift-ring-rebalance" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.312222 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3838b46-7315-4dae-839a-ca906573228c" containerName="swift-ring-rebalance" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.312367 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3838b46-7315-4dae-839a-ca906573228c" containerName="swift-ring-rebalance" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.312836 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.314976 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.315840 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.328720 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"] Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.332298 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-ring-data-devices\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.332372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/726badc7-41e3-4488-ae7b-08b970cd0886-etc-swift\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.332437 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-scripts\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.332516 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-dispersionconf\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.332675 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-swiftconf\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.332763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b27v\" (UniqueName: \"kubernetes.io/projected/726badc7-41e3-4488-ae7b-08b970cd0886-kube-api-access-2b27v\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.436986 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-swiftconf\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.437047 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b27v\" (UniqueName: \"kubernetes.io/projected/726badc7-41e3-4488-ae7b-08b970cd0886-kube-api-access-2b27v\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 
16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.437289 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-ring-data-devices\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.437352 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/726badc7-41e3-4488-ae7b-08b970cd0886-etc-swift\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.437513 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-scripts\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.438223 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-ring-data-devices\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.438338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-scripts\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc 
kubenswrapper[4831]: I0309 16:34:48.438432 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-dispersionconf\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.442538 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-dispersionconf\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.442582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-swiftconf\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.443473 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/726badc7-41e3-4488-ae7b-08b970cd0886-etc-swift\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: I0309 16:34:48.459438 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b27v\" (UniqueName: \"kubernetes.io/projected/726badc7-41e3-4488-ae7b-08b970cd0886-kube-api-access-2b27v\") pod \"swift-ring-rebalance-debug-bv7j5\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" Mar 09 16:34:48 crc kubenswrapper[4831]: 
I0309 16:34:48.632891 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"
Mar 09 16:34:49 crc kubenswrapper[4831]: I0309 16:34:49.064792 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"]
Mar 09 16:34:50 crc kubenswrapper[4831]: I0309 16:34:50.026578 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" event={"ID":"726badc7-41e3-4488-ae7b-08b970cd0886","Type":"ContainerStarted","Data":"f928b482fbaf1519c924d3442f3129d823c5d3d08ed7bcea131370dc6ed6848c"}
Mar 09 16:34:50 crc kubenswrapper[4831]: I0309 16:34:50.027240 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" event={"ID":"726badc7-41e3-4488-ae7b-08b970cd0886","Type":"ContainerStarted","Data":"4b124f5e09eded199bccb273759865cb45e7eb3194567b7faf64636718c47042"}
Mar 09 16:34:50 crc kubenswrapper[4831]: I0309 16:34:50.051262 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" podStartSLOduration=2.051237132 podStartE2EDuration="2.051237132s" podCreationTimestamp="2026-03-09 16:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:50.046196599 +0000 UTC m=+2217.179879022" watchObservedRunningTime="2026-03-09 16:34:50.051237132 +0000 UTC m=+2217.184919555"
Mar 09 16:34:51 crc kubenswrapper[4831]: I0309 16:34:51.038352 4831 generic.go:334] "Generic (PLEG): container finished" podID="726badc7-41e3-4488-ae7b-08b970cd0886" containerID="f928b482fbaf1519c924d3442f3129d823c5d3d08ed7bcea131370dc6ed6848c" exitCode=0
Mar 09 16:34:51 crc kubenswrapper[4831]: I0309 16:34:51.038446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5" event={"ID":"726badc7-41e3-4488-ae7b-08b970cd0886","Type":"ContainerDied","Data":"f928b482fbaf1519c924d3442f3129d823c5d3d08ed7bcea131370dc6ed6848c"}
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.339772 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.379615 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"]
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.390432 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"]
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.508749 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-swiftconf\") pod \"726badc7-41e3-4488-ae7b-08b970cd0886\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") "
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.508797 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b27v\" (UniqueName: \"kubernetes.io/projected/726badc7-41e3-4488-ae7b-08b970cd0886-kube-api-access-2b27v\") pod \"726badc7-41e3-4488-ae7b-08b970cd0886\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") "
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.508856 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-dispersionconf\") pod \"726badc7-41e3-4488-ae7b-08b970cd0886\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") "
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.508932 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-scripts\") pod \"726badc7-41e3-4488-ae7b-08b970cd0886\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") "
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.509039 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-ring-data-devices\") pod \"726badc7-41e3-4488-ae7b-08b970cd0886\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") "
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.509071 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/726badc7-41e3-4488-ae7b-08b970cd0886-etc-swift\") pod \"726badc7-41e3-4488-ae7b-08b970cd0886\" (UID: \"726badc7-41e3-4488-ae7b-08b970cd0886\") "
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.509562 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "726badc7-41e3-4488-ae7b-08b970cd0886" (UID: "726badc7-41e3-4488-ae7b-08b970cd0886"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.509874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726badc7-41e3-4488-ae7b-08b970cd0886-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "726badc7-41e3-4488-ae7b-08b970cd0886" (UID: "726badc7-41e3-4488-ae7b-08b970cd0886"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.516585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726badc7-41e3-4488-ae7b-08b970cd0886-kube-api-access-2b27v" (OuterVolumeSpecName: "kube-api-access-2b27v") pod "726badc7-41e3-4488-ae7b-08b970cd0886" (UID: "726badc7-41e3-4488-ae7b-08b970cd0886"). InnerVolumeSpecName "kube-api-access-2b27v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.527705 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-scripts" (OuterVolumeSpecName: "scripts") pod "726badc7-41e3-4488-ae7b-08b970cd0886" (UID: "726badc7-41e3-4488-ae7b-08b970cd0886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.532567 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "726badc7-41e3-4488-ae7b-08b970cd0886" (UID: "726badc7-41e3-4488-ae7b-08b970cd0886"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.535533 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "726badc7-41e3-4488-ae7b-08b970cd0886" (UID: "726badc7-41e3-4488-ae7b-08b970cd0886"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.610796 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.610847 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/726badc7-41e3-4488-ae7b-08b970cd0886-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.610863 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/726badc7-41e3-4488-ae7b-08b970cd0886-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.611092 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.611139 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b27v\" (UniqueName: \"kubernetes.io/projected/726badc7-41e3-4488-ae7b-08b970cd0886-kube-api-access-2b27v\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:52 crc kubenswrapper[4831]: I0309 16:34:52.611157 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/726badc7-41e3-4488-ae7b-08b970cd0886-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.058475 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b124f5e09eded199bccb273759865cb45e7eb3194567b7faf64636718c47042"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.058545 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bv7j5"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.501348 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"]
Mar 09 16:34:53 crc kubenswrapper[4831]: E0309 16:34:53.501688 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726badc7-41e3-4488-ae7b-08b970cd0886" containerName="swift-ring-rebalance"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.501702 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="726badc7-41e3-4488-ae7b-08b970cd0886" containerName="swift-ring-rebalance"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.501870 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="726badc7-41e3-4488-ae7b-08b970cd0886" containerName="swift-ring-rebalance"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.502388 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.506964 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.507117 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.533895 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-scripts\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.534206 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d165974-bfa9-4ee6-be87-87ef4c0abc40-etc-swift\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.534268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-ring-data-devices\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.534323 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-dispersionconf\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.534365 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.534461 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ksg\" (UniqueName: \"kubernetes.io/projected/9d165974-bfa9-4ee6-be87-87ef4c0abc40-kube-api-access-p9ksg\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.535949 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"]
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.631772 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726badc7-41e3-4488-ae7b-08b970cd0886" path="/var/lib/kubelet/pods/726badc7-41e3-4488-ae7b-08b970cd0886/volumes"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.635691 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-scripts\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.636640 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d165974-bfa9-4ee6-be87-87ef4c0abc40-etc-swift\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.637102 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d165974-bfa9-4ee6-be87-87ef4c0abc40-etc-swift\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.637129 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-ring-data-devices\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.637234 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-dispersionconf\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.637311 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.637392 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9ksg\" (UniqueName: \"kubernetes.io/projected/9d165974-bfa9-4ee6-be87-87ef4c0abc40-kube-api-access-p9ksg\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.638692 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.639851 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.646820 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-dispersionconf\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.647429 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.647811 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-scripts\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.648379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-ring-data-devices\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.654285 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ksg\" (UniqueName: \"kubernetes.io/projected/9d165974-bfa9-4ee6-be87-87ef4c0abc40-kube-api-access-p9ksg\") pod \"swift-ring-rebalance-debug-kl5k7\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:53 crc kubenswrapper[4831]: I0309 16:34:53.828217 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:54 crc kubenswrapper[4831]: I0309 16:34:54.274793 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"]
Mar 09 16:34:55 crc kubenswrapper[4831]: I0309 16:34:55.081563 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7" event={"ID":"9d165974-bfa9-4ee6-be87-87ef4c0abc40","Type":"ContainerStarted","Data":"e8d8e9fbadab4d5d20b5eda8dca42cacdda8e144d9340ed1a1d67eee4e6cbb6f"}
Mar 09 16:34:55 crc kubenswrapper[4831]: I0309 16:34:55.081627 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7" event={"ID":"9d165974-bfa9-4ee6-be87-87ef4c0abc40","Type":"ContainerStarted","Data":"ddcaf0c5e023ac098527fab1e83f6a6ec8cac6e9cfc8fc86b9076f64a2193c19"}
Mar 09 16:34:55 crc kubenswrapper[4831]: I0309 16:34:55.102745 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7" podStartSLOduration=2.102724179 podStartE2EDuration="2.102724179s" podCreationTimestamp="2026-03-09 16:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:34:55.098243932 +0000 UTC m=+2222.231926355" watchObservedRunningTime="2026-03-09 16:34:55.102724179 +0000 UTC m=+2222.236406612"
Mar 09 16:34:56 crc kubenswrapper[4831]: I0309 16:34:56.089851 4831 generic.go:334] "Generic (PLEG): container finished" podID="9d165974-bfa9-4ee6-be87-87ef4c0abc40" containerID="e8d8e9fbadab4d5d20b5eda8dca42cacdda8e144d9340ed1a1d67eee4e6cbb6f" exitCode=0
Mar 09 16:34:56 crc kubenswrapper[4831]: I0309 16:34:56.089923 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7" event={"ID":"9d165974-bfa9-4ee6-be87-87ef4c0abc40","Type":"ContainerDied","Data":"e8d8e9fbadab4d5d20b5eda8dca42cacdda8e144d9340ed1a1d67eee4e6cbb6f"}
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.370933 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.402854 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"]
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.408354 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"]
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.448361 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9ksg\" (UniqueName: \"kubernetes.io/projected/9d165974-bfa9-4ee6-be87-87ef4c0abc40-kube-api-access-p9ksg\") pod \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") "
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.448457 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-scripts\") pod \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") "
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.448491 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-dispersionconf\") pod \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") "
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.448545 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d165974-bfa9-4ee6-be87-87ef4c0abc40-etc-swift\") pod \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") "
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.448571 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-ring-data-devices\") pod \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") "
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.448593 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf\") pod \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") "
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.449365 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d165974-bfa9-4ee6-be87-87ef4c0abc40-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9d165974-bfa9-4ee6-be87-87ef4c0abc40" (UID: "9d165974-bfa9-4ee6-be87-87ef4c0abc40"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.449986 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9d165974-bfa9-4ee6-be87-87ef4c0abc40" (UID: "9d165974-bfa9-4ee6-be87-87ef4c0abc40"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.453389 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d165974-bfa9-4ee6-be87-87ef4c0abc40-kube-api-access-p9ksg" (OuterVolumeSpecName: "kube-api-access-p9ksg") pod "9d165974-bfa9-4ee6-be87-87ef4c0abc40" (UID: "9d165974-bfa9-4ee6-be87-87ef4c0abc40"). InnerVolumeSpecName "kube-api-access-p9ksg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:34:57 crc kubenswrapper[4831]: E0309 16:34:57.479872 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf podName:9d165974-bfa9-4ee6-be87-87ef4c0abc40 nodeName:}" failed. No retries permitted until 2026-03-09 16:34:57.979827737 +0000 UTC m=+2225.113510150 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf") pod "9d165974-bfa9-4ee6-be87-87ef4c0abc40" (UID: "9d165974-bfa9-4ee6-be87-87ef4c0abc40") : error deleting /var/lib/kubelet/pods/9d165974-bfa9-4ee6-be87-87ef4c0abc40/volume-subpaths: remove /var/lib/kubelet/pods/9d165974-bfa9-4ee6-be87-87ef4c0abc40/volume-subpaths: no such file or directory
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.480101 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-scripts" (OuterVolumeSpecName: "scripts") pod "9d165974-bfa9-4ee6-be87-87ef4c0abc40" (UID: "9d165974-bfa9-4ee6-be87-87ef4c0abc40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.482752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9d165974-bfa9-4ee6-be87-87ef4c0abc40" (UID: "9d165974-bfa9-4ee6-be87-87ef4c0abc40"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.549736 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d165974-bfa9-4ee6-be87-87ef4c0abc40-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.549775 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.549788 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9ksg\" (UniqueName: \"kubernetes.io/projected/9d165974-bfa9-4ee6-be87-87ef4c0abc40-kube-api-access-p9ksg\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.549806 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d165974-bfa9-4ee6-be87-87ef4c0abc40-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:57 crc kubenswrapper[4831]: I0309 16:34:57.549821 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.054843 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf\") pod \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\" (UID: \"9d165974-bfa9-4ee6-be87-87ef4c0abc40\") "
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.057953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9d165974-bfa9-4ee6-be87-87ef4c0abc40" (UID: "9d165974-bfa9-4ee6-be87-87ef4c0abc40"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.106989 4831 scope.go:117] "RemoveContainer" containerID="e8d8e9fbadab4d5d20b5eda8dca42cacdda8e144d9340ed1a1d67eee4e6cbb6f"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.107076 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kl5k7"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.156592 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d165974-bfa9-4ee6-be87-87ef4c0abc40-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.535074 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twfws"]
Mar 09 16:34:58 crc kubenswrapper[4831]: E0309 16:34:58.535343 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d165974-bfa9-4ee6-be87-87ef4c0abc40" containerName="swift-ring-rebalance"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.535356 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d165974-bfa9-4ee6-be87-87ef4c0abc40" containerName="swift-ring-rebalance"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.535514 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d165974-bfa9-4ee6-be87-87ef4c0abc40" containerName="swift-ring-rebalance"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.535964 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.538941 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.539566 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.552355 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twfws"]
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.563490 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw669\" (UniqueName: \"kubernetes.io/projected/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-kube-api-access-dw669\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.563566 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-etc-swift\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.563715 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-dispersionconf\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.563775 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-swiftconf\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.563820 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-scripts\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.563939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.665992 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-dispersionconf\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.666066 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-swiftconf\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.666118 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-scripts\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.666176 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.666214 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw669\" (UniqueName: \"kubernetes.io/projected/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-kube-api-access-dw669\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.666285 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-etc-swift\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.666801 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-etc-swift\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.667520 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.667529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-scripts\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.675298 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-dispersionconf\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.680059 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-swiftconf\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.689482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw669\" (UniqueName: \"kubernetes.io/projected/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-kube-api-access-dw669\") pod \"swift-ring-rebalance-debug-twfws\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:58 crc kubenswrapper[4831]: I0309 16:34:58.854598 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws"
Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.273102 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twfws"]
Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.506595 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kqtzg"]
Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.508424 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqtzg"
Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.522513 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqtzg"]
Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.576409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-utilities\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg"
Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.576480 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-catalog-content\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg"
Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.576568 4831 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8jks\" (UniqueName: \"kubernetes.io/projected/e2e1af67-df62-4df9-963e-c1b5f99de49a-kube-api-access-q8jks\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.628355 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d165974-bfa9-4ee6-be87-87ef4c0abc40" path="/var/lib/kubelet/pods/9d165974-bfa9-4ee6-be87-87ef4c0abc40/volumes" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.677542 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-catalog-content\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.677674 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8jks\" (UniqueName: \"kubernetes.io/projected/e2e1af67-df62-4df9-963e-c1b5f99de49a-kube-api-access-q8jks\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.677738 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-utilities\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.678511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-catalog-content\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.678529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-utilities\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.708022 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8jks\" (UniqueName: \"kubernetes.io/projected/e2e1af67-df62-4df9-963e-c1b5f99de49a-kube-api-access-q8jks\") pod \"community-operators-kqtzg\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:34:59 crc kubenswrapper[4831]: I0309 16:34:59.825274 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:35:00 crc kubenswrapper[4831]: I0309 16:35:00.138594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws" event={"ID":"ddea67aa-2629-48a3-9cc7-89a87d36ab3a","Type":"ContainerStarted","Data":"15c8e214628985457e57ebcc27b33483f53bf932bceb8284c95b0605853fb2cd"} Mar 09 16:35:00 crc kubenswrapper[4831]: I0309 16:35:00.138647 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws" event={"ID":"ddea67aa-2629-48a3-9cc7-89a87d36ab3a","Type":"ContainerStarted","Data":"a7a00e4ec322422163e452461d3a47b1e395fab5c73aeae8044458fbe40306e6"} Mar 09 16:35:00 crc kubenswrapper[4831]: I0309 16:35:00.164299 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws" podStartSLOduration=2.1642816910000002 podStartE2EDuration="2.164281691s" podCreationTimestamp="2026-03-09 16:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:00.160663668 +0000 UTC m=+2227.294346091" watchObservedRunningTime="2026-03-09 16:35:00.164281691 +0000 UTC m=+2227.297964124" Mar 09 16:35:00 crc kubenswrapper[4831]: I0309 16:35:00.382804 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqtzg"] Mar 09 16:35:01 crc kubenswrapper[4831]: I0309 16:35:01.149899 4831 generic.go:334] "Generic (PLEG): container finished" podID="ddea67aa-2629-48a3-9cc7-89a87d36ab3a" containerID="15c8e214628985457e57ebcc27b33483f53bf932bceb8284c95b0605853fb2cd" exitCode=0 Mar 09 16:35:01 crc kubenswrapper[4831]: I0309 16:35:01.149999 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws" 
event={"ID":"ddea67aa-2629-48a3-9cc7-89a87d36ab3a","Type":"ContainerDied","Data":"15c8e214628985457e57ebcc27b33483f53bf932bceb8284c95b0605853fb2cd"} Mar 09 16:35:01 crc kubenswrapper[4831]: I0309 16:35:01.153481 4831 generic.go:334] "Generic (PLEG): container finished" podID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerID="f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a" exitCode=0 Mar 09 16:35:01 crc kubenswrapper[4831]: I0309 16:35:01.153552 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqtzg" event={"ID":"e2e1af67-df62-4df9-963e-c1b5f99de49a","Type":"ContainerDied","Data":"f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a"} Mar 09 16:35:01 crc kubenswrapper[4831]: I0309 16:35:01.153618 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqtzg" event={"ID":"e2e1af67-df62-4df9-963e-c1b5f99de49a","Type":"ContainerStarted","Data":"5e7bc4bf7878c3374297c0547952ad45873a5ea7eda05f6c4ed9f5f07e6ba044"} Mar 09 16:35:01 crc kubenswrapper[4831]: I0309 16:35:01.155796 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.534174 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.568054 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twfws"] Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.576280 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twfws"] Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.722657 4831 scope.go:117] "RemoveContainer" containerID="d79ad13a6052a98b198ba093f6a7ea885909d266af4642f977b2e31dc23db442" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.726623 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-etc-swift\") pod \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.726958 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw669\" (UniqueName: \"kubernetes.io/projected/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-kube-api-access-dw669\") pod \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.727054 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-scripts\") pod \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.727111 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-dispersionconf\") pod \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\" (UID: 
\"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.727135 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-ring-data-devices\") pod \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.727153 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-swiftconf\") pod \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\" (UID: \"ddea67aa-2629-48a3-9cc7-89a87d36ab3a\") " Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.727462 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ddea67aa-2629-48a3-9cc7-89a87d36ab3a" (UID: "ddea67aa-2629-48a3-9cc7-89a87d36ab3a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.727799 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.732430 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ddea67aa-2629-48a3-9cc7-89a87d36ab3a" (UID: "ddea67aa-2629-48a3-9cc7-89a87d36ab3a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.733207 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-kube-api-access-dw669" (OuterVolumeSpecName: "kube-api-access-dw669") pod "ddea67aa-2629-48a3-9cc7-89a87d36ab3a" (UID: "ddea67aa-2629-48a3-9cc7-89a87d36ab3a"). InnerVolumeSpecName "kube-api-access-dw669". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.751319 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ddea67aa-2629-48a3-9cc7-89a87d36ab3a" (UID: "ddea67aa-2629-48a3-9cc7-89a87d36ab3a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.751784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-scripts" (OuterVolumeSpecName: "scripts") pod "ddea67aa-2629-48a3-9cc7-89a87d36ab3a" (UID: "ddea67aa-2629-48a3-9cc7-89a87d36ab3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.770527 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ddea67aa-2629-48a3-9cc7-89a87d36ab3a" (UID: "ddea67aa-2629-48a3-9cc7-89a87d36ab3a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.829333 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw669\" (UniqueName: \"kubernetes.io/projected/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-kube-api-access-dw669\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.829409 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.829440 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.829460 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:02 crc kubenswrapper[4831]: I0309 16:35:02.829473 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddea67aa-2629-48a3-9cc7-89a87d36ab3a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.171801 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twfws" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.171801 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7a00e4ec322422163e452461d3a47b1e395fab5c73aeae8044458fbe40306e6" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.175008 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqtzg" event={"ID":"e2e1af67-df62-4df9-963e-c1b5f99de49a","Type":"ContainerStarted","Data":"e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8"} Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.630143 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddea67aa-2629-48a3-9cc7-89a87d36ab3a" path="/var/lib/kubelet/pods/ddea67aa-2629-48a3-9cc7-89a87d36ab3a/volumes" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.705442 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp"] Mar 09 16:35:03 crc kubenswrapper[4831]: E0309 16:35:03.705810 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddea67aa-2629-48a3-9cc7-89a87d36ab3a" containerName="swift-ring-rebalance" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.705832 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddea67aa-2629-48a3-9cc7-89a87d36ab3a" containerName="swift-ring-rebalance" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.706006 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddea67aa-2629-48a3-9cc7-89a87d36ab3a" containerName="swift-ring-rebalance" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.706506 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.708681 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.714963 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp"] Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.716145 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.846698 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0aa54e45-76e8-44d4-ba2b-923f3c579c30-etc-swift\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.846773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-swiftconf\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.846825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcw6h\" (UniqueName: \"kubernetes.io/projected/0aa54e45-76e8-44d4-ba2b-923f3c579c30-kube-api-access-tcw6h\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.846867 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-dispersionconf\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.846892 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-ring-data-devices\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.846920 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-scripts\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.947710 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-scripts\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.947803 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0aa54e45-76e8-44d4-ba2b-923f3c579c30-etc-swift\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 
16:35:03.947848 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-swiftconf\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.947879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcw6h\" (UniqueName: \"kubernetes.io/projected/0aa54e45-76e8-44d4-ba2b-923f3c579c30-kube-api-access-tcw6h\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.947911 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-dispersionconf\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.947934 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-ring-data-devices\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.948801 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0aa54e45-76e8-44d4-ba2b-923f3c579c30-etc-swift\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc 
kubenswrapper[4831]: I0309 16:35:03.948905 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-ring-data-devices\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.949280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-scripts\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.952694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-dispersionconf\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.954770 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-swiftconf\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:03 crc kubenswrapper[4831]: I0309 16:35:03.969658 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcw6h\" (UniqueName: \"kubernetes.io/projected/0aa54e45-76e8-44d4-ba2b-923f3c579c30-kube-api-access-tcw6h\") pod \"swift-ring-rebalance-debug-hrzbp\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:04 crc kubenswrapper[4831]: 
I0309 16:35:04.033062 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:04 crc kubenswrapper[4831]: I0309 16:35:04.189792 4831 generic.go:334] "Generic (PLEG): container finished" podID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerID="e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8" exitCode=0 Mar 09 16:35:04 crc kubenswrapper[4831]: I0309 16:35:04.190112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqtzg" event={"ID":"e2e1af67-df62-4df9-963e-c1b5f99de49a","Type":"ContainerDied","Data":"e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8"} Mar 09 16:35:04 crc kubenswrapper[4831]: I0309 16:35:04.466799 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp"] Mar 09 16:35:04 crc kubenswrapper[4831]: W0309 16:35:04.470238 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aa54e45_76e8_44d4_ba2b_923f3c579c30.slice/crio-2af247768e06ee7a34ef0e5f615db176d76aeee31e5542dadca5242eb50ce9f8 WatchSource:0}: Error finding container 2af247768e06ee7a34ef0e5f615db176d76aeee31e5542dadca5242eb50ce9f8: Status 404 returned error can't find the container with id 2af247768e06ee7a34ef0e5f615db176d76aeee31e5542dadca5242eb50ce9f8 Mar 09 16:35:05 crc kubenswrapper[4831]: I0309 16:35:05.198893 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" event={"ID":"0aa54e45-76e8-44d4-ba2b-923f3c579c30","Type":"ContainerStarted","Data":"2ddba3dbe33dffe31bfd0498d4aaa4c65a5dc5fa3aed2fa366447655ff3ff729"} Mar 09 16:35:05 crc kubenswrapper[4831]: I0309 16:35:05.199196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" 
event={"ID":"0aa54e45-76e8-44d4-ba2b-923f3c579c30","Type":"ContainerStarted","Data":"2af247768e06ee7a34ef0e5f615db176d76aeee31e5542dadca5242eb50ce9f8"} Mar 09 16:35:05 crc kubenswrapper[4831]: I0309 16:35:05.201814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqtzg" event={"ID":"e2e1af67-df62-4df9-963e-c1b5f99de49a","Type":"ContainerStarted","Data":"94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88"} Mar 09 16:35:05 crc kubenswrapper[4831]: I0309 16:35:05.220143 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" podStartSLOduration=2.220127622 podStartE2EDuration="2.220127622s" podCreationTimestamp="2026-03-09 16:35:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:05.215849461 +0000 UTC m=+2232.349531894" watchObservedRunningTime="2026-03-09 16:35:05.220127622 +0000 UTC m=+2232.353810045" Mar 09 16:35:05 crc kubenswrapper[4831]: I0309 16:35:05.235754 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kqtzg" podStartSLOduration=2.807587157 podStartE2EDuration="6.235720885s" podCreationTimestamp="2026-03-09 16:34:59 +0000 UTC" firstStartedPulling="2026-03-09 16:35:01.1553823 +0000 UTC m=+2228.289064723" lastFinishedPulling="2026-03-09 16:35:04.583516028 +0000 UTC m=+2231.717198451" observedRunningTime="2026-03-09 16:35:05.233495002 +0000 UTC m=+2232.367177425" watchObservedRunningTime="2026-03-09 16:35:05.235720885 +0000 UTC m=+2232.369403308" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.076509 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-25flg"] Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.078338 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.105868 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25flg"] Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.184330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44bth\" (UniqueName: \"kubernetes.io/projected/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-kube-api-access-44bth\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.184432 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-utilities\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.184490 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-catalog-content\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.213106 4831 generic.go:334] "Generic (PLEG): container finished" podID="0aa54e45-76e8-44d4-ba2b-923f3c579c30" containerID="2ddba3dbe33dffe31bfd0498d4aaa4c65a5dc5fa3aed2fa366447655ff3ff729" exitCode=0 Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.213300 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" 
event={"ID":"0aa54e45-76e8-44d4-ba2b-923f3c579c30","Type":"ContainerDied","Data":"2ddba3dbe33dffe31bfd0498d4aaa4c65a5dc5fa3aed2fa366447655ff3ff729"} Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.286144 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-catalog-content\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.286240 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44bth\" (UniqueName: \"kubernetes.io/projected/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-kube-api-access-44bth\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.286315 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-utilities\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.286934 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-utilities\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.287229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-catalog-content\") pod \"redhat-operators-25flg\" (UID: 
\"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.315664 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44bth\" (UniqueName: \"kubernetes.io/projected/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-kube-api-access-44bth\") pod \"redhat-operators-25flg\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.394292 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:06 crc kubenswrapper[4831]: I0309 16:35:06.878392 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25flg"] Mar 09 16:35:06 crc kubenswrapper[4831]: W0309 16:35:06.882239 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd03930f_0dc4_4c0c_aa34_7ee061af0cf6.slice/crio-6a207c9e5b672919e27fc1555b40be3ac0b1abf71fe2ed23f5ba72ecc4d36c7f WatchSource:0}: Error finding container 6a207c9e5b672919e27fc1555b40be3ac0b1abf71fe2ed23f5ba72ecc4d36c7f: Status 404 returned error can't find the container with id 6a207c9e5b672919e27fc1555b40be3ac0b1abf71fe2ed23f5ba72ecc4d36c7f Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.222820 4831 generic.go:334] "Generic (PLEG): container finished" podID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerID="25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c" exitCode=0 Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.222939 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25flg" event={"ID":"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6","Type":"ContainerDied","Data":"25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c"} Mar 09 16:35:07 crc 
kubenswrapper[4831]: I0309 16:35:07.223154 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25flg" event={"ID":"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6","Type":"ContainerStarted","Data":"6a207c9e5b672919e27fc1555b40be3ac0b1abf71fe2ed23f5ba72ecc4d36c7f"} Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.638982 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.669808 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp"] Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.675355 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp"] Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.835028 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-ring-data-devices\") pod \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.835354 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-scripts\") pod \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.835371 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-swiftconf\") pod \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.835496 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tcw6h\" (UniqueName: \"kubernetes.io/projected/0aa54e45-76e8-44d4-ba2b-923f3c579c30-kube-api-access-tcw6h\") pod \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.835527 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-dispersionconf\") pod \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.835638 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0aa54e45-76e8-44d4-ba2b-923f3c579c30-etc-swift\") pod \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\" (UID: \"0aa54e45-76e8-44d4-ba2b-923f3c579c30\") " Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.836629 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aa54e45-76e8-44d4-ba2b-923f3c579c30-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0aa54e45-76e8-44d4-ba2b-923f3c579c30" (UID: "0aa54e45-76e8-44d4-ba2b-923f3c579c30"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.837793 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0aa54e45-76e8-44d4-ba2b-923f3c579c30" (UID: "0aa54e45-76e8-44d4-ba2b-923f3c579c30"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.841389 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa54e45-76e8-44d4-ba2b-923f3c579c30-kube-api-access-tcw6h" (OuterVolumeSpecName: "kube-api-access-tcw6h") pod "0aa54e45-76e8-44d4-ba2b-923f3c579c30" (UID: "0aa54e45-76e8-44d4-ba2b-923f3c579c30"). InnerVolumeSpecName "kube-api-access-tcw6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.860600 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-scripts" (OuterVolumeSpecName: "scripts") pod "0aa54e45-76e8-44d4-ba2b-923f3c579c30" (UID: "0aa54e45-76e8-44d4-ba2b-923f3c579c30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.875149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0aa54e45-76e8-44d4-ba2b-923f3c579c30" (UID: "0aa54e45-76e8-44d4-ba2b-923f3c579c30"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.889152 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0aa54e45-76e8-44d4-ba2b-923f3c579c30" (UID: "0aa54e45-76e8-44d4-ba2b-923f3c579c30"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.937817 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.937856 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa54e45-76e8-44d4-ba2b-923f3c579c30-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.937867 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.937880 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcw6h\" (UniqueName: \"kubernetes.io/projected/0aa54e45-76e8-44d4-ba2b-923f3c579c30-kube-api-access-tcw6h\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.937893 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0aa54e45-76e8-44d4-ba2b-923f3c579c30-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:07 crc kubenswrapper[4831]: I0309 16:35:07.937902 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0aa54e45-76e8-44d4-ba2b-923f3c579c30-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.232587 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25flg" event={"ID":"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6","Type":"ContainerStarted","Data":"10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a"} Mar 09 16:35:08 crc 
kubenswrapper[4831]: I0309 16:35:08.234105 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2af247768e06ee7a34ef0e5f615db176d76aeee31e5542dadca5242eb50ce9f8" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.234125 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrzbp" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.796293 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8knzx"] Mar 09 16:35:08 crc kubenswrapper[4831]: E0309 16:35:08.796890 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa54e45-76e8-44d4-ba2b-923f3c579c30" containerName="swift-ring-rebalance" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.796910 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa54e45-76e8-44d4-ba2b-923f3c579c30" containerName="swift-ring-rebalance" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.797126 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa54e45-76e8-44d4-ba2b-923f3c579c30" containerName="swift-ring-rebalance" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.797754 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.800124 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.800665 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.807447 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8knzx"] Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.955055 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-scripts\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.955106 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-ring-data-devices\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.955165 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf1cc191-1420-4b0b-8118-d6a6a843d31d-etc-swift\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.955197 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6hxx\" (UniqueName: \"kubernetes.io/projected/bf1cc191-1420-4b0b-8118-d6a6a843d31d-kube-api-access-f6hxx\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.955361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-dispersionconf\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:08 crc kubenswrapper[4831]: I0309 16:35:08.955501 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-swiftconf\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.057563 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf1cc191-1420-4b0b-8118-d6a6a843d31d-etc-swift\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.057646 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6hxx\" (UniqueName: \"kubernetes.io/projected/bf1cc191-1420-4b0b-8118-d6a6a843d31d-kube-api-access-f6hxx\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 
16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.057729 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-dispersionconf\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.057781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-swiftconf\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.057907 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-scripts\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.057942 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-ring-data-devices\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.058055 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf1cc191-1420-4b0b-8118-d6a6a843d31d-etc-swift\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc 
kubenswrapper[4831]: I0309 16:35:09.058745 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-scripts\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.058999 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-ring-data-devices\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.064977 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-dispersionconf\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.070662 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-swiftconf\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.073591 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6hxx\" (UniqueName: \"kubernetes.io/projected/bf1cc191-1420-4b0b-8118-d6a6a843d31d-kube-api-access-f6hxx\") pod \"swift-ring-rebalance-debug-8knzx\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: 
I0309 16:35:09.115742 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.539429 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8knzx"] Mar 09 16:35:09 crc kubenswrapper[4831]: W0309 16:35:09.547633 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf1cc191_1420_4b0b_8118_d6a6a843d31d.slice/crio-fe13fa194b0fb685183b02795695aca7e5d62dd0644d963e8fc17e5ce3a50d9b WatchSource:0}: Error finding container fe13fa194b0fb685183b02795695aca7e5d62dd0644d963e8fc17e5ce3a50d9b: Status 404 returned error can't find the container with id fe13fa194b0fb685183b02795695aca7e5d62dd0644d963e8fc17e5ce3a50d9b Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.636480 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa54e45-76e8-44d4-ba2b-923f3c579c30" path="/var/lib/kubelet/pods/0aa54e45-76e8-44d4-ba2b-923f3c579c30/volumes" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.826111 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.826174 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:35:09 crc kubenswrapper[4831]: I0309 16:35:09.871959 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:35:10 crc kubenswrapper[4831]: I0309 16:35:10.251188 4831 generic.go:334] "Generic (PLEG): container finished" podID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerID="10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a" exitCode=0 Mar 09 16:35:10 crc kubenswrapper[4831]: I0309 
16:35:10.251260 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25flg" event={"ID":"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6","Type":"ContainerDied","Data":"10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a"} Mar 09 16:35:10 crc kubenswrapper[4831]: I0309 16:35:10.253589 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" event={"ID":"bf1cc191-1420-4b0b-8118-d6a6a843d31d","Type":"ContainerStarted","Data":"868e719094c890095004e9571bcad4931f5cb42ff0bc81974ff98394fb1862b4"} Mar 09 16:35:10 crc kubenswrapper[4831]: I0309 16:35:10.253636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" event={"ID":"bf1cc191-1420-4b0b-8118-d6a6a843d31d","Type":"ContainerStarted","Data":"fe13fa194b0fb685183b02795695aca7e5d62dd0644d963e8fc17e5ce3a50d9b"} Mar 09 16:35:10 crc kubenswrapper[4831]: I0309 16:35:10.298716 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" podStartSLOduration=2.298697239 podStartE2EDuration="2.298697239s" podCreationTimestamp="2026-03-09 16:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:10.295026285 +0000 UTC m=+2237.428708718" watchObservedRunningTime="2026-03-09 16:35:10.298697239 +0000 UTC m=+2237.432379662" Mar 09 16:35:10 crc kubenswrapper[4831]: I0309 16:35:10.310372 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:35:10 crc kubenswrapper[4831]: E0309 16:35:10.433751 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd03930f_0dc4_4c0c_aa34_7ee061af0cf6.slice/crio-conmon-10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a.scope\": RecentStats: unable to find data in memory cache]" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.270245 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqtzg"] Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.278102 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25flg" event={"ID":"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6","Type":"ContainerStarted","Data":"297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e"} Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.280261 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf1cc191-1420-4b0b-8118-d6a6a843d31d" containerID="868e719094c890095004e9571bcad4931f5cb42ff0bc81974ff98394fb1862b4" exitCode=0 Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.280446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx" event={"ID":"bf1cc191-1420-4b0b-8118-d6a6a843d31d","Type":"ContainerDied","Data":"868e719094c890095004e9571bcad4931f5cb42ff0bc81974ff98394fb1862b4"} Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.280591 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kqtzg" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="registry-server" containerID="cri-o://94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88" gracePeriod=2 Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.308755 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-25flg" podStartSLOduration=2.280953471 podStartE2EDuration="6.308735985s" podCreationTimestamp="2026-03-09 16:35:06 +0000 UTC" 
firstStartedPulling="2026-03-09 16:35:07.224649752 +0000 UTC m=+2234.358332175" lastFinishedPulling="2026-03-09 16:35:11.252432266 +0000 UTC m=+2238.386114689" observedRunningTime="2026-03-09 16:35:12.308430677 +0000 UTC m=+2239.442113100" watchObservedRunningTime="2026-03-09 16:35:12.308735985 +0000 UTC m=+2239.442418408" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.713232 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqtzg" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.718479 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-utilities\") pod \"e2e1af67-df62-4df9-963e-c1b5f99de49a\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.718575 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8jks\" (UniqueName: \"kubernetes.io/projected/e2e1af67-df62-4df9-963e-c1b5f99de49a-kube-api-access-q8jks\") pod \"e2e1af67-df62-4df9-963e-c1b5f99de49a\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.718597 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-catalog-content\") pod \"e2e1af67-df62-4df9-963e-c1b5f99de49a\" (UID: \"e2e1af67-df62-4df9-963e-c1b5f99de49a\") " Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.720936 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-utilities" (OuterVolumeSpecName: "utilities") pod "e2e1af67-df62-4df9-963e-c1b5f99de49a" (UID: "e2e1af67-df62-4df9-963e-c1b5f99de49a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.724290 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e1af67-df62-4df9-963e-c1b5f99de49a-kube-api-access-q8jks" (OuterVolumeSpecName: "kube-api-access-q8jks") pod "e2e1af67-df62-4df9-963e-c1b5f99de49a" (UID: "e2e1af67-df62-4df9-963e-c1b5f99de49a"). InnerVolumeSpecName "kube-api-access-q8jks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.774195 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2e1af67-df62-4df9-963e-c1b5f99de49a" (UID: "e2e1af67-df62-4df9-963e-c1b5f99de49a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.819877 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.819914 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8jks\" (UniqueName: \"kubernetes.io/projected/e2e1af67-df62-4df9-963e-c1b5f99de49a-kube-api-access-q8jks\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:12 crc kubenswrapper[4831]: I0309 16:35:12.819925 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e1af67-df62-4df9-963e-c1b5f99de49a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.289093 4831 generic.go:334] "Generic (PLEG): container finished" podID="e2e1af67-df62-4df9-963e-c1b5f99de49a" 
containerID="94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88" exitCode=0
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.289145 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqtzg"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.289175 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqtzg" event={"ID":"e2e1af67-df62-4df9-963e-c1b5f99de49a","Type":"ContainerDied","Data":"94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88"}
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.289568 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqtzg" event={"ID":"e2e1af67-df62-4df9-963e-c1b5f99de49a","Type":"ContainerDied","Data":"5e7bc4bf7878c3374297c0547952ad45873a5ea7eda05f6c4ed9f5f07e6ba044"}
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.289602 4831 scope.go:117] "RemoveContainer" containerID="94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.324807 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqtzg"]
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.332202 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kqtzg"]
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.334547 4831 scope.go:117] "RemoveContainer" containerID="e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.363275 4831 scope.go:117] "RemoveContainer" containerID="f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.403644 4831 scope.go:117] "RemoveContainer" containerID="94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88"
Mar 09 16:35:13 crc kubenswrapper[4831]: E0309 16:35:13.405917 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88\": container with ID starting with 94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88 not found: ID does not exist" containerID="94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.405968 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88"} err="failed to get container status \"94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88\": rpc error: code = NotFound desc = could not find container \"94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88\": container with ID starting with 94b847fcb9c4422b4df7ae55dee79f2b61e9f1b3ec9c02958ed5c6f1866cea88 not found: ID does not exist"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.406001 4831 scope.go:117] "RemoveContainer" containerID="e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8"
Mar 09 16:35:13 crc kubenswrapper[4831]: E0309 16:35:13.407579 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8\": container with ID starting with e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8 not found: ID does not exist" containerID="e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.407618 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8"} err="failed to get container status \"e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8\": rpc error: code = NotFound desc = could not find container \"e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8\": container with ID starting with e0930cb10baae3920555460351583a02c3fade6d6b8aa7dece2d23fe96af77d8 not found: ID does not exist"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.407645 4831 scope.go:117] "RemoveContainer" containerID="f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a"
Mar 09 16:35:13 crc kubenswrapper[4831]: E0309 16:35:13.409814 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a\": container with ID starting with f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a not found: ID does not exist" containerID="f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.409832 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a"} err="failed to get container status \"f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a\": rpc error: code = NotFound desc = could not find container \"f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a\": container with ID starting with f90e15cac09e0ccca2a6fed735037619b5409204f3284c9bd7eecd76a8bf945a not found: ID does not exist"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.603157 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.656997 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" path="/var/lib/kubelet/pods/e2e1af67-df62-4df9-963e-c1b5f99de49a/volumes"
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.666102 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8knzx"]
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.679527 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8knzx"]
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.754116 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-scripts\") pod \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") "
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.754178 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6hxx\" (UniqueName: \"kubernetes.io/projected/bf1cc191-1420-4b0b-8118-d6a6a843d31d-kube-api-access-f6hxx\") pod \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") "
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.754210 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf1cc191-1420-4b0b-8118-d6a6a843d31d-etc-swift\") pod \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") "
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.754291 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-dispersionconf\") pod \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") "
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.754329 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-swiftconf\") pod \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") "
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.755000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-ring-data-devices\") pod \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\" (UID: \"bf1cc191-1420-4b0b-8118-d6a6a843d31d\") "
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.755249 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1cc191-1420-4b0b-8118-d6a6a843d31d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bf1cc191-1420-4b0b-8118-d6a6a843d31d" (UID: "bf1cc191-1420-4b0b-8118-d6a6a843d31d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.755290 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bf1cc191-1420-4b0b-8118-d6a6a843d31d" (UID: "bf1cc191-1420-4b0b-8118-d6a6a843d31d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.759419 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1cc191-1420-4b0b-8118-d6a6a843d31d-kube-api-access-f6hxx" (OuterVolumeSpecName: "kube-api-access-f6hxx") pod "bf1cc191-1420-4b0b-8118-d6a6a843d31d" (UID: "bf1cc191-1420-4b0b-8118-d6a6a843d31d"). InnerVolumeSpecName "kube-api-access-f6hxx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.775027 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-scripts" (OuterVolumeSpecName: "scripts") pod "bf1cc191-1420-4b0b-8118-d6a6a843d31d" (UID: "bf1cc191-1420-4b0b-8118-d6a6a843d31d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.786959 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bf1cc191-1420-4b0b-8118-d6a6a843d31d" (UID: "bf1cc191-1420-4b0b-8118-d6a6a843d31d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.799230 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bf1cc191-1420-4b0b-8118-d6a6a843d31d" (UID: "bf1cc191-1420-4b0b-8118-d6a6a843d31d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.856129 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.856165 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6hxx\" (UniqueName: \"kubernetes.io/projected/bf1cc191-1420-4b0b-8118-d6a6a843d31d-kube-api-access-f6hxx\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.856177 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf1cc191-1420-4b0b-8118-d6a6a843d31d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.856187 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.856195 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf1cc191-1420-4b0b-8118-d6a6a843d31d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:13 crc kubenswrapper[4831]: I0309 16:35:13.856204 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf1cc191-1420-4b0b-8118-d6a6a843d31d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.301845 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe13fa194b0fb685183b02795695aca7e5d62dd0644d963e8fc17e5ce3a50d9b"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.301908 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8knzx"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.768524 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"]
Mar 09 16:35:14 crc kubenswrapper[4831]: E0309 16:35:14.768959 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="extract-utilities"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.768980 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="extract-utilities"
Mar 09 16:35:14 crc kubenswrapper[4831]: E0309 16:35:14.768991 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="registry-server"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.769000 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="registry-server"
Mar 09 16:35:14 crc kubenswrapper[4831]: E0309 16:35:14.769027 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1cc191-1420-4b0b-8118-d6a6a843d31d" containerName="swift-ring-rebalance"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.769033 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1cc191-1420-4b0b-8118-d6a6a843d31d" containerName="swift-ring-rebalance"
Mar 09 16:35:14 crc kubenswrapper[4831]: E0309 16:35:14.769055 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="extract-content"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.769061 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="extract-content"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.769209 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e1af67-df62-4df9-963e-c1b5f99de49a" containerName="registry-server"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.769247 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1cc191-1420-4b0b-8118-d6a6a843d31d" containerName="swift-ring-rebalance"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.769845 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.772302 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.772383 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.778755 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"]
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.870436 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-scripts\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.870505 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-dispersionconf\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.870541 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-swiftconf\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.870568 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70acfbdd-6f6a-4627-baff-78a92da8d583-etc-swift\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.870615 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m27bp\" (UniqueName: \"kubernetes.io/projected/70acfbdd-6f6a-4627-baff-78a92da8d583-kube-api-access-m27bp\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.870645 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-ring-data-devices\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.972463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m27bp\" (UniqueName: \"kubernetes.io/projected/70acfbdd-6f6a-4627-baff-78a92da8d583-kube-api-access-m27bp\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.972553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-ring-data-devices\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.972602 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-scripts\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.972647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-dispersionconf\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.972689 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-swiftconf\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.972721 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70acfbdd-6f6a-4627-baff-78a92da8d583-etc-swift\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.973215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70acfbdd-6f6a-4627-baff-78a92da8d583-etc-swift\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.974084 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-scripts\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.974693 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-ring-data-devices\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.977113 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-dispersionconf\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.978482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-swiftconf\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:14 crc kubenswrapper[4831]: I0309 16:35:14.996562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m27bp\" (UniqueName: \"kubernetes.io/projected/70acfbdd-6f6a-4627-baff-78a92da8d583-kube-api-access-m27bp\") pod \"swift-ring-rebalance-debug-smdcg\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:15 crc kubenswrapper[4831]: I0309 16:35:15.123385 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:15 crc kubenswrapper[4831]: I0309 16:35:15.576951 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"]
Mar 09 16:35:15 crc kubenswrapper[4831]: W0309 16:35:15.579486 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70acfbdd_6f6a_4627_baff_78a92da8d583.slice/crio-5364ca46cf440058a3c4d2889d9865edd90d201c027bea2670d7c93373031fcb WatchSource:0}: Error finding container 5364ca46cf440058a3c4d2889d9865edd90d201c027bea2670d7c93373031fcb: Status 404 returned error can't find the container with id 5364ca46cf440058a3c4d2889d9865edd90d201c027bea2670d7c93373031fcb
Mar 09 16:35:15 crc kubenswrapper[4831]: I0309 16:35:15.636323 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1cc191-1420-4b0b-8118-d6a6a843d31d" path="/var/lib/kubelet/pods/bf1cc191-1420-4b0b-8118-d6a6a843d31d/volumes"
Mar 09 16:35:16 crc kubenswrapper[4831]: I0309 16:35:16.324201 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg" event={"ID":"70acfbdd-6f6a-4627-baff-78a92da8d583","Type":"ContainerStarted","Data":"6d81775386d2a78cb830e09f3729c8e3e57d5864452c77720cc872936e756149"}
Mar 09 16:35:16 crc kubenswrapper[4831]: I0309 16:35:16.324243 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg" event={"ID":"70acfbdd-6f6a-4627-baff-78a92da8d583","Type":"ContainerStarted","Data":"5364ca46cf440058a3c4d2889d9865edd90d201c027bea2670d7c93373031fcb"}
Mar 09 16:35:16 crc kubenswrapper[4831]: I0309 16:35:16.348391 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg" podStartSLOduration=2.348371685 podStartE2EDuration="2.348371685s" podCreationTimestamp="2026-03-09 16:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:16.341980204 +0000 UTC m=+2243.475662647" watchObservedRunningTime="2026-03-09 16:35:16.348371685 +0000 UTC m=+2243.482054108"
Mar 09 16:35:16 crc kubenswrapper[4831]: I0309 16:35:16.395660 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-25flg"
Mar 09 16:35:16 crc kubenswrapper[4831]: I0309 16:35:16.396605 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-25flg"
Mar 09 16:35:17 crc kubenswrapper[4831]: I0309 16:35:17.334462 4831 generic.go:334] "Generic (PLEG): container finished" podID="70acfbdd-6f6a-4627-baff-78a92da8d583" containerID="6d81775386d2a78cb830e09f3729c8e3e57d5864452c77720cc872936e756149" exitCode=0
Mar 09 16:35:17 crc kubenswrapper[4831]: I0309 16:35:17.334514 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg" event={"ID":"70acfbdd-6f6a-4627-baff-78a92da8d583","Type":"ContainerDied","Data":"6d81775386d2a78cb830e09f3729c8e3e57d5864452c77720cc872936e756149"}
Mar 09 16:35:17 crc kubenswrapper[4831]: I0309 16:35:17.447749 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-25flg" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="registry-server" probeResult="failure" output=<
Mar 09 16:35:17 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s
Mar 09 16:35:17 crc kubenswrapper[4831]: >
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.630515 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.711482 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"]
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.715509 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"]
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.728510 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70acfbdd-6f6a-4627-baff-78a92da8d583-etc-swift\") pod \"70acfbdd-6f6a-4627-baff-78a92da8d583\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") "
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.728574 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-scripts\") pod \"70acfbdd-6f6a-4627-baff-78a92da8d583\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") "
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.728648 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-swiftconf\") pod \"70acfbdd-6f6a-4627-baff-78a92da8d583\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") "
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.728740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m27bp\" (UniqueName: \"kubernetes.io/projected/70acfbdd-6f6a-4627-baff-78a92da8d583-kube-api-access-m27bp\") pod \"70acfbdd-6f6a-4627-baff-78a92da8d583\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") "
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.728782 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-ring-data-devices\") pod \"70acfbdd-6f6a-4627-baff-78a92da8d583\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") "
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.728813 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-dispersionconf\") pod \"70acfbdd-6f6a-4627-baff-78a92da8d583\" (UID: \"70acfbdd-6f6a-4627-baff-78a92da8d583\") "
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.730348 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "70acfbdd-6f6a-4627-baff-78a92da8d583" (UID: "70acfbdd-6f6a-4627-baff-78a92da8d583"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.731697 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70acfbdd-6f6a-4627-baff-78a92da8d583-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "70acfbdd-6f6a-4627-baff-78a92da8d583" (UID: "70acfbdd-6f6a-4627-baff-78a92da8d583"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.739711 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70acfbdd-6f6a-4627-baff-78a92da8d583-kube-api-access-m27bp" (OuterVolumeSpecName: "kube-api-access-m27bp") pod "70acfbdd-6f6a-4627-baff-78a92da8d583" (UID: "70acfbdd-6f6a-4627-baff-78a92da8d583"). InnerVolumeSpecName "kube-api-access-m27bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.755121 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-scripts" (OuterVolumeSpecName: "scripts") pod "70acfbdd-6f6a-4627-baff-78a92da8d583" (UID: "70acfbdd-6f6a-4627-baff-78a92da8d583"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.757775 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "70acfbdd-6f6a-4627-baff-78a92da8d583" (UID: "70acfbdd-6f6a-4627-baff-78a92da8d583"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.773376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "70acfbdd-6f6a-4627-baff-78a92da8d583" (UID: "70acfbdd-6f6a-4627-baff-78a92da8d583"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.830920 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70acfbdd-6f6a-4627-baff-78a92da8d583-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.830978 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.830990 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.831001 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m27bp\" (UniqueName: \"kubernetes.io/projected/70acfbdd-6f6a-4627-baff-78a92da8d583-kube-api-access-m27bp\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.831016 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70acfbdd-6f6a-4627-baff-78a92da8d583-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:18 crc kubenswrapper[4831]: I0309 16:35:18.831028 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70acfbdd-6f6a-4627-baff-78a92da8d583-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.351326 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5364ca46cf440058a3c4d2889d9865edd90d201c027bea2670d7c93373031fcb"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.351655 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-smdcg"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.629341 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70acfbdd-6f6a-4627-baff-78a92da8d583" path="/var/lib/kubelet/pods/70acfbdd-6f6a-4627-baff-78a92da8d583/volumes"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.852949 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"]
Mar 09 16:35:19 crc kubenswrapper[4831]: E0309 16:35:19.853712 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70acfbdd-6f6a-4627-baff-78a92da8d583" containerName="swift-ring-rebalance"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.853726 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="70acfbdd-6f6a-4627-baff-78a92da8d583" containerName="swift-ring-rebalance"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.853885 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="70acfbdd-6f6a-4627-baff-78a92da8d583" containerName="swift-ring-rebalance"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.854448 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.856958 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.860276 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:35:19 crc kubenswrapper[4831]: I0309 16:35:19.865260 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"]
Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.049303 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-dispersionconf\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"
Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.049364 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-swiftconf\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"
Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.049477 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2da7f8be-0986-43a3-b7c9-8a98d20b2907-etc-swift\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"
Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.049542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-ring-data-devices\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.049565 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrzt\" (UniqueName: \"kubernetes.io/projected/2da7f8be-0986-43a3-b7c9-8a98d20b2907-kube-api-access-kdrzt\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.049627 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-scripts\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.150958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-swiftconf\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.151016 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2da7f8be-0986-43a3-b7c9-8a98d20b2907-etc-swift\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.151085 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-ring-data-devices\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.151108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrzt\" (UniqueName: \"kubernetes.io/projected/2da7f8be-0986-43a3-b7c9-8a98d20b2907-kube-api-access-kdrzt\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.151152 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-scripts\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.151217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-dispersionconf\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.152036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2da7f8be-0986-43a3-b7c9-8a98d20b2907-etc-swift\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 
16:35:20.152180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-scripts\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.152497 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-ring-data-devices\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.155387 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-swiftconf\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.155452 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-dispersionconf\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.173598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrzt\" (UniqueName: \"kubernetes.io/projected/2da7f8be-0986-43a3-b7c9-8a98d20b2907-kube-api-access-kdrzt\") pod \"swift-ring-rebalance-debug-zp4sb\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.470816 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:20 crc kubenswrapper[4831]: I0309 16:35:20.983521 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"] Mar 09 16:35:20 crc kubenswrapper[4831]: W0309 16:35:20.989330 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2da7f8be_0986_43a3_b7c9_8a98d20b2907.slice/crio-33a1566cea5548a794b19d72b43b93bdd6280f130ccf7849c9e544f14e470526 WatchSource:0}: Error finding container 33a1566cea5548a794b19d72b43b93bdd6280f130ccf7849c9e544f14e470526: Status 404 returned error can't find the container with id 33a1566cea5548a794b19d72b43b93bdd6280f130ccf7849c9e544f14e470526 Mar 09 16:35:21 crc kubenswrapper[4831]: I0309 16:35:21.372728 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" event={"ID":"2da7f8be-0986-43a3-b7c9-8a98d20b2907","Type":"ContainerStarted","Data":"e5b63fb8a0daf0b9b267bc0f075dde3ca0f45befb92dff8d6d0380c9a16f1665"} Mar 09 16:35:21 crc kubenswrapper[4831]: I0309 16:35:21.373044 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" event={"ID":"2da7f8be-0986-43a3-b7c9-8a98d20b2907","Type":"ContainerStarted","Data":"33a1566cea5548a794b19d72b43b93bdd6280f130ccf7849c9e544f14e470526"} Mar 09 16:35:21 crc kubenswrapper[4831]: I0309 16:35:21.389590 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" podStartSLOduration=2.389570429 podStartE2EDuration="2.389570429s" podCreationTimestamp="2026-03-09 16:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:21.386711838 +0000 UTC m=+2248.520394271" 
watchObservedRunningTime="2026-03-09 16:35:21.389570429 +0000 UTC m=+2248.523252852" Mar 09 16:35:22 crc kubenswrapper[4831]: I0309 16:35:22.381877 4831 generic.go:334] "Generic (PLEG): container finished" podID="2da7f8be-0986-43a3-b7c9-8a98d20b2907" containerID="e5b63fb8a0daf0b9b267bc0f075dde3ca0f45befb92dff8d6d0380c9a16f1665" exitCode=0 Mar 09 16:35:22 crc kubenswrapper[4831]: I0309 16:35:22.381943 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" event={"ID":"2da7f8be-0986-43a3-b7c9-8a98d20b2907","Type":"ContainerDied","Data":"e5b63fb8a0daf0b9b267bc0f075dde3ca0f45befb92dff8d6d0380c9a16f1665"} Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.685340 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.721425 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"] Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.736534 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb"] Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.806845 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-dispersionconf\") pod \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.806934 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-ring-data-devices\") pod \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.806966 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-swiftconf\") pod \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.807044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2da7f8be-0986-43a3-b7c9-8a98d20b2907-etc-swift\") pod \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.807093 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdrzt\" (UniqueName: \"kubernetes.io/projected/2da7f8be-0986-43a3-b7c9-8a98d20b2907-kube-api-access-kdrzt\") pod \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.807120 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-scripts\") pod \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\" (UID: \"2da7f8be-0986-43a3-b7c9-8a98d20b2907\") " Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.807451 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2da7f8be-0986-43a3-b7c9-8a98d20b2907" (UID: "2da7f8be-0986-43a3-b7c9-8a98d20b2907"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.807882 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.807945 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da7f8be-0986-43a3-b7c9-8a98d20b2907-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2da7f8be-0986-43a3-b7c9-8a98d20b2907" (UID: "2da7f8be-0986-43a3-b7c9-8a98d20b2907"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.816641 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da7f8be-0986-43a3-b7c9-8a98d20b2907-kube-api-access-kdrzt" (OuterVolumeSpecName: "kube-api-access-kdrzt") pod "2da7f8be-0986-43a3-b7c9-8a98d20b2907" (UID: "2da7f8be-0986-43a3-b7c9-8a98d20b2907"). InnerVolumeSpecName "kube-api-access-kdrzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.864559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2da7f8be-0986-43a3-b7c9-8a98d20b2907" (UID: "2da7f8be-0986-43a3-b7c9-8a98d20b2907"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.865076 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2da7f8be-0986-43a3-b7c9-8a98d20b2907" (UID: "2da7f8be-0986-43a3-b7c9-8a98d20b2907"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.866696 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-scripts" (OuterVolumeSpecName: "scripts") pod "2da7f8be-0986-43a3-b7c9-8a98d20b2907" (UID: "2da7f8be-0986-43a3-b7c9-8a98d20b2907"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.909512 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.909548 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2da7f8be-0986-43a3-b7c9-8a98d20b2907-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.909558 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdrzt\" (UniqueName: \"kubernetes.io/projected/2da7f8be-0986-43a3-b7c9-8a98d20b2907-kube-api-access-kdrzt\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.909569 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da7f8be-0986-43a3-b7c9-8a98d20b2907-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:23 crc kubenswrapper[4831]: I0309 16:35:23.909578 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2da7f8be-0986-43a3-b7c9-8a98d20b2907-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.403919 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="33a1566cea5548a794b19d72b43b93bdd6280f130ccf7849c9e544f14e470526" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.403964 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zp4sb" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.894722 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p2smc"] Mar 09 16:35:24 crc kubenswrapper[4831]: E0309 16:35:24.895888 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da7f8be-0986-43a3-b7c9-8a98d20b2907" containerName="swift-ring-rebalance" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.895908 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da7f8be-0986-43a3-b7c9-8a98d20b2907" containerName="swift-ring-rebalance" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.896096 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da7f8be-0986-43a3-b7c9-8a98d20b2907" containerName="swift-ring-rebalance" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.902014 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.904420 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p2smc"] Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.904894 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:24 crc kubenswrapper[4831]: I0309 16:35:24.905440 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.040583 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2phq\" (UniqueName: \"kubernetes.io/projected/70f6e9cc-8371-4219-87a5-d9ecaceae439-kube-api-access-s2phq\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.040650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-scripts\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.040726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-ring-data-devices\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.040757 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-swiftconf\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.040801 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70f6e9cc-8371-4219-87a5-d9ecaceae439-etc-swift\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.040829 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-dispersionconf\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.142289 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-ring-data-devices\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.142669 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-swiftconf\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: 
I0309 16:35:25.142787 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70f6e9cc-8371-4219-87a5-d9ecaceae439-etc-swift\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.142893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-dispersionconf\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.143091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2phq\" (UniqueName: \"kubernetes.io/projected/70f6e9cc-8371-4219-87a5-d9ecaceae439-kube-api-access-s2phq\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.143213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-scripts\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.143689 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-ring-data-devices\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc 
kubenswrapper[4831]: I0309 16:35:25.143708 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70f6e9cc-8371-4219-87a5-d9ecaceae439-etc-swift\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.144151 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-scripts\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.147674 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-swiftconf\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.152852 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-dispersionconf\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.167064 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2phq\" (UniqueName: \"kubernetes.io/projected/70f6e9cc-8371-4219-87a5-d9ecaceae439-kube-api-access-s2phq\") pod \"swift-ring-rebalance-debug-p2smc\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 
16:35:25.226363 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.625359 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da7f8be-0986-43a3-b7c9-8a98d20b2907" path="/var/lib/kubelet/pods/2da7f8be-0986-43a3-b7c9-8a98d20b2907/volumes" Mar 09 16:35:25 crc kubenswrapper[4831]: I0309 16:35:25.717672 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p2smc"] Mar 09 16:35:25 crc kubenswrapper[4831]: W0309 16:35:25.725197 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f6e9cc_8371_4219_87a5_d9ecaceae439.slice/crio-e8e5ca9cd920cc63aa913d152140eb0d0b9a9cce55413e4846de309fa13a2c23 WatchSource:0}: Error finding container e8e5ca9cd920cc63aa913d152140eb0d0b9a9cce55413e4846de309fa13a2c23: Status 404 returned error can't find the container with id e8e5ca9cd920cc63aa913d152140eb0d0b9a9cce55413e4846de309fa13a2c23 Mar 09 16:35:26 crc kubenswrapper[4831]: I0309 16:35:26.422469 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" event={"ID":"70f6e9cc-8371-4219-87a5-d9ecaceae439","Type":"ContainerStarted","Data":"0a22a67d5e3ef26de183d2eaf71f40976eb4f596ae122e72f6726a5f22e28f2d"} Mar 09 16:35:26 crc kubenswrapper[4831]: I0309 16:35:26.422523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" event={"ID":"70f6e9cc-8371-4219-87a5-d9ecaceae439","Type":"ContainerStarted","Data":"e8e5ca9cd920cc63aa913d152140eb0d0b9a9cce55413e4846de309fa13a2c23"} Mar 09 16:35:26 crc kubenswrapper[4831]: I0309 16:35:26.451037 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" podStartSLOduration=2.451020289 
podStartE2EDuration="2.451020289s" podCreationTimestamp="2026-03-09 16:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:26.442420035 +0000 UTC m=+2253.576102468" watchObservedRunningTime="2026-03-09 16:35:26.451020289 +0000 UTC m=+2253.584702712" Mar 09 16:35:26 crc kubenswrapper[4831]: I0309 16:35:26.456431 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:26 crc kubenswrapper[4831]: I0309 16:35:26.507932 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:26 crc kubenswrapper[4831]: I0309 16:35:26.696699 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25flg"] Mar 09 16:35:27 crc kubenswrapper[4831]: I0309 16:35:27.434373 4831 generic.go:334] "Generic (PLEG): container finished" podID="70f6e9cc-8371-4219-87a5-d9ecaceae439" containerID="0a22a67d5e3ef26de183d2eaf71f40976eb4f596ae122e72f6726a5f22e28f2d" exitCode=0 Mar 09 16:35:27 crc kubenswrapper[4831]: I0309 16:35:27.434435 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" event={"ID":"70f6e9cc-8371-4219-87a5-d9ecaceae439","Type":"ContainerDied","Data":"0a22a67d5e3ef26de183d2eaf71f40976eb4f596ae122e72f6726a5f22e28f2d"} Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.442470 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-25flg" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="registry-server" containerID="cri-o://297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e" gracePeriod=2 Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.879862 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.935968 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70f6e9cc-8371-4219-87a5-d9ecaceae439-etc-swift\") pod \"70f6e9cc-8371-4219-87a5-d9ecaceae439\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.936046 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-scripts\") pod \"70f6e9cc-8371-4219-87a5-d9ecaceae439\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.936138 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2phq\" (UniqueName: \"kubernetes.io/projected/70f6e9cc-8371-4219-87a5-d9ecaceae439-kube-api-access-s2phq\") pod \"70f6e9cc-8371-4219-87a5-d9ecaceae439\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.936173 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-swiftconf\") pod \"70f6e9cc-8371-4219-87a5-d9ecaceae439\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.936213 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-ring-data-devices\") pod \"70f6e9cc-8371-4219-87a5-d9ecaceae439\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.936280 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-dispersionconf\") pod \"70f6e9cc-8371-4219-87a5-d9ecaceae439\" (UID: \"70f6e9cc-8371-4219-87a5-d9ecaceae439\") " Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.937988 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f6e9cc-8371-4219-87a5-d9ecaceae439-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "70f6e9cc-8371-4219-87a5-d9ecaceae439" (UID: "70f6e9cc-8371-4219-87a5-d9ecaceae439"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.938119 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "70f6e9cc-8371-4219-87a5-d9ecaceae439" (UID: "70f6e9cc-8371-4219-87a5-d9ecaceae439"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.951411 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f6e9cc-8371-4219-87a5-d9ecaceae439-kube-api-access-s2phq" (OuterVolumeSpecName: "kube-api-access-s2phq") pod "70f6e9cc-8371-4219-87a5-d9ecaceae439" (UID: "70f6e9cc-8371-4219-87a5-d9ecaceae439"). InnerVolumeSpecName "kube-api-access-s2phq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.951567 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p2smc"] Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.962124 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p2smc"] Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.980097 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "70f6e9cc-8371-4219-87a5-d9ecaceae439" (UID: "70f6e9cc-8371-4219-87a5-d9ecaceae439"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:28 crc kubenswrapper[4831]: I0309 16:35:28.992735 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-scripts" (OuterVolumeSpecName: "scripts") pod "70f6e9cc-8371-4219-87a5-d9ecaceae439" (UID: "70f6e9cc-8371-4219-87a5-d9ecaceae439"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.007828 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "70f6e9cc-8371-4219-87a5-d9ecaceae439" (UID: "70f6e9cc-8371-4219-87a5-d9ecaceae439"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.033860 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038297 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44bth\" (UniqueName: \"kubernetes.io/projected/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-kube-api-access-44bth\") pod \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038393 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-catalog-content\") pod \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038593 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-utilities\") pod \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\" (UID: \"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6\") " Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038883 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2phq\" (UniqueName: \"kubernetes.io/projected/70f6e9cc-8371-4219-87a5-d9ecaceae439-kube-api-access-s2phq\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038897 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038905 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: 
I0309 16:35:29.038914 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/70f6e9cc-8371-4219-87a5-d9ecaceae439-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038922 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/70f6e9cc-8371-4219-87a5-d9ecaceae439-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.038930 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f6e9cc-8371-4219-87a5-d9ecaceae439-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.040688 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-utilities" (OuterVolumeSpecName: "utilities") pod "cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" (UID: "cd03930f-0dc4-4c0c-aa34-7ee061af0cf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.042668 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-kube-api-access-44bth" (OuterVolumeSpecName: "kube-api-access-44bth") pod "cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" (UID: "cd03930f-0dc4-4c0c-aa34-7ee061af0cf6"). InnerVolumeSpecName "kube-api-access-44bth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.140769 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.140817 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44bth\" (UniqueName: \"kubernetes.io/projected/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-kube-api-access-44bth\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.178822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" (UID: "cd03930f-0dc4-4c0c-aa34-7ee061af0cf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.244085 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.456106 4831 generic.go:334] "Generic (PLEG): container finished" podID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerID="297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e" exitCode=0 Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.456348 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25flg" event={"ID":"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6","Type":"ContainerDied","Data":"297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e"} Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.456547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-25flg" event={"ID":"cd03930f-0dc4-4c0c-aa34-7ee061af0cf6","Type":"ContainerDied","Data":"6a207c9e5b672919e27fc1555b40be3ac0b1abf71fe2ed23f5ba72ecc4d36c7f"} Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.456571 4831 scope.go:117] "RemoveContainer" containerID="297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.456364 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25flg" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.462006 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e5ca9cd920cc63aa913d152140eb0d0b9a9cce55413e4846de309fa13a2c23" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.462055 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p2smc" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.480046 4831 scope.go:117] "RemoveContainer" containerID="10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.496673 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25flg"] Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.501935 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-25flg"] Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.522118 4831 scope.go:117] "RemoveContainer" containerID="25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.539186 4831 scope.go:117] "RemoveContainer" containerID="297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e" Mar 09 16:35:29 crc kubenswrapper[4831]: E0309 16:35:29.539733 4831 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e\": container with ID starting with 297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e not found: ID does not exist" containerID="297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.539788 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e"} err="failed to get container status \"297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e\": rpc error: code = NotFound desc = could not find container \"297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e\": container with ID starting with 297739df21021618bab83bbcd0a5a28f202aac0b113e7c6b5543bb7d28175a3e not found: ID does not exist" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.539820 4831 scope.go:117] "RemoveContainer" containerID="10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a" Mar 09 16:35:29 crc kubenswrapper[4831]: E0309 16:35:29.540103 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a\": container with ID starting with 10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a not found: ID does not exist" containerID="10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.540135 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a"} err="failed to get container status \"10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a\": rpc error: code = NotFound desc = could not find container 
\"10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a\": container with ID starting with 10838b955bf588a5d5f09fca92972ed3973c5e95410197904d7b8d93140acb9a not found: ID does not exist" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.540159 4831 scope.go:117] "RemoveContainer" containerID="25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c" Mar 09 16:35:29 crc kubenswrapper[4831]: E0309 16:35:29.540427 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c\": container with ID starting with 25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c not found: ID does not exist" containerID="25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.540461 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c"} err="failed to get container status \"25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c\": rpc error: code = NotFound desc = could not find container \"25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c\": container with ID starting with 25ef2d85b2e35a1341aa7f35895ddd6772b9b09057585aa85335f2678c6e787c not found: ID does not exist" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.626985 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f6e9cc-8371-4219-87a5-d9ecaceae439" path="/var/lib/kubelet/pods/70f6e9cc-8371-4219-87a5-d9ecaceae439/volumes" Mar 09 16:35:29 crc kubenswrapper[4831]: I0309 16:35:29.627515 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" path="/var/lib/kubelet/pods/cd03930f-0dc4-4c0c-aa34-7ee061af0cf6/volumes" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.086771 
4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b"] Mar 09 16:35:30 crc kubenswrapper[4831]: E0309 16:35:30.087111 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="extract-utilities" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.087130 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="extract-utilities" Mar 09 16:35:30 crc kubenswrapper[4831]: E0309 16:35:30.087151 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f6e9cc-8371-4219-87a5-d9ecaceae439" containerName="swift-ring-rebalance" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.087160 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f6e9cc-8371-4219-87a5-d9ecaceae439" containerName="swift-ring-rebalance" Mar 09 16:35:30 crc kubenswrapper[4831]: E0309 16:35:30.087176 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="extract-content" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.087186 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="extract-content" Mar 09 16:35:30 crc kubenswrapper[4831]: E0309 16:35:30.087204 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="registry-server" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.087212 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="registry-server" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.087420 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f6e9cc-8371-4219-87a5-d9ecaceae439" containerName="swift-ring-rebalance" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.087446 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cd03930f-0dc4-4c0c-aa34-7ee061af0cf6" containerName="registry-server" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.088019 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.090248 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.091110 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.102037 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b"] Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.155325 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-ring-data-devices\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.155556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29e508bb-f8cb-4d42-9c75-14e63aee5511-etc-swift\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.155609 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlvj\" (UniqueName: \"kubernetes.io/projected/29e508bb-f8cb-4d42-9c75-14e63aee5511-kube-api-access-ghlvj\") pod 
\"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.155650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-dispersionconf\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.155753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-scripts\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.155796 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-swiftconf\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.256940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghlvj\" (UniqueName: \"kubernetes.io/projected/29e508bb-f8cb-4d42-9c75-14e63aee5511-kube-api-access-ghlvj\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.256986 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-dispersionconf\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.257027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-scripts\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.257066 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-swiftconf\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.257087 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-ring-data-devices\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.257152 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29e508bb-f8cb-4d42-9c75-14e63aee5511-etc-swift\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.257616 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/29e508bb-f8cb-4d42-9c75-14e63aee5511-etc-swift\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.258265 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-scripts\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.258393 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-ring-data-devices\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.260935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-swiftconf\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.260953 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-dispersionconf\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.274050 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlvj\" (UniqueName: 
\"kubernetes.io/projected/29e508bb-f8cb-4d42-9c75-14e63aee5511-kube-api-access-ghlvj\") pod \"swift-ring-rebalance-debug-p9j2b\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.434018 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:30 crc kubenswrapper[4831]: I0309 16:35:30.890415 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b"] Mar 09 16:35:31 crc kubenswrapper[4831]: I0309 16:35:31.481843 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" event={"ID":"29e508bb-f8cb-4d42-9c75-14e63aee5511","Type":"ContainerStarted","Data":"28df16619f57c2008bd9ad9eb4b57306183a7d6a545323bd15576430bf91eac9"} Mar 09 16:35:31 crc kubenswrapper[4831]: I0309 16:35:31.482205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" event={"ID":"29e508bb-f8cb-4d42-9c75-14e63aee5511","Type":"ContainerStarted","Data":"e9bd303f617d8131641926ff9ec56d78e39b4451c7dc0a108919f38be8f0eea4"} Mar 09 16:35:31 crc kubenswrapper[4831]: I0309 16:35:31.498483 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" podStartSLOduration=1.498468551 podStartE2EDuration="1.498468551s" podCreationTimestamp="2026-03-09 16:35:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:31.495501947 +0000 UTC m=+2258.629184370" watchObservedRunningTime="2026-03-09 16:35:31.498468551 +0000 UTC m=+2258.632150974" Mar 09 16:35:32 crc kubenswrapper[4831]: I0309 16:35:32.491572 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="29e508bb-f8cb-4d42-9c75-14e63aee5511" containerID="28df16619f57c2008bd9ad9eb4b57306183a7d6a545323bd15576430bf91eac9" exitCode=0 Mar 09 16:35:32 crc kubenswrapper[4831]: I0309 16:35:32.491645 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" event={"ID":"29e508bb-f8cb-4d42-9c75-14e63aee5511","Type":"ContainerDied","Data":"28df16619f57c2008bd9ad9eb4b57306183a7d6a545323bd15576430bf91eac9"} Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.018469 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.018536 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.791288 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.834442 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b"] Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.839349 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b"] Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.911022 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-ring-data-devices\") pod \"29e508bb-f8cb-4d42-9c75-14e63aee5511\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.911095 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29e508bb-f8cb-4d42-9c75-14e63aee5511-etc-swift\") pod \"29e508bb-f8cb-4d42-9c75-14e63aee5511\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.911117 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-scripts\") pod \"29e508bb-f8cb-4d42-9c75-14e63aee5511\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.911190 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-dispersionconf\") pod \"29e508bb-f8cb-4d42-9c75-14e63aee5511\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.911243 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ghlvj\" (UniqueName: \"kubernetes.io/projected/29e508bb-f8cb-4d42-9c75-14e63aee5511-kube-api-access-ghlvj\") pod \"29e508bb-f8cb-4d42-9c75-14e63aee5511\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.911275 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-swiftconf\") pod \"29e508bb-f8cb-4d42-9c75-14e63aee5511\" (UID: \"29e508bb-f8cb-4d42-9c75-14e63aee5511\") " Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.911808 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "29e508bb-f8cb-4d42-9c75-14e63aee5511" (UID: "29e508bb-f8cb-4d42-9c75-14e63aee5511"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.912016 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e508bb-f8cb-4d42-9c75-14e63aee5511-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "29e508bb-f8cb-4d42-9c75-14e63aee5511" (UID: "29e508bb-f8cb-4d42-9c75-14e63aee5511"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.916964 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e508bb-f8cb-4d42-9c75-14e63aee5511-kube-api-access-ghlvj" (OuterVolumeSpecName: "kube-api-access-ghlvj") pod "29e508bb-f8cb-4d42-9c75-14e63aee5511" (UID: "29e508bb-f8cb-4d42-9c75-14e63aee5511"). InnerVolumeSpecName "kube-api-access-ghlvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.934615 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-scripts" (OuterVolumeSpecName: "scripts") pod "29e508bb-f8cb-4d42-9c75-14e63aee5511" (UID: "29e508bb-f8cb-4d42-9c75-14e63aee5511"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.936667 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "29e508bb-f8cb-4d42-9c75-14e63aee5511" (UID: "29e508bb-f8cb-4d42-9c75-14e63aee5511"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:33 crc kubenswrapper[4831]: I0309 16:35:33.937934 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "29e508bb-f8cb-4d42-9c75-14e63aee5511" (UID: "29e508bb-f8cb-4d42-9c75-14e63aee5511"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.012814 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.012854 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29e508bb-f8cb-4d42-9c75-14e63aee5511-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.012864 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29e508bb-f8cb-4d42-9c75-14e63aee5511-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.012873 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.012882 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghlvj\" (UniqueName: \"kubernetes.io/projected/29e508bb-f8cb-4d42-9c75-14e63aee5511-kube-api-access-ghlvj\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.012895 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29e508bb-f8cb-4d42-9c75-14e63aee5511-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.508158 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9bd303f617d8131641926ff9ec56d78e39b4451c7dc0a108919f38be8f0eea4" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.508200 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p9j2b" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.993050 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6"] Mar 09 16:35:34 crc kubenswrapper[4831]: E0309 16:35:34.993813 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e508bb-f8cb-4d42-9c75-14e63aee5511" containerName="swift-ring-rebalance" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.993829 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e508bb-f8cb-4d42-9c75-14e63aee5511" containerName="swift-ring-rebalance" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.994017 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e508bb-f8cb-4d42-9c75-14e63aee5511" containerName="swift-ring-rebalance" Mar 09 16:35:34 crc kubenswrapper[4831]: I0309 16:35:34.994505 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.000018 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.000081 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.012846 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6"] Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.128511 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-scripts\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.128557 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk89p\" (UniqueName: \"kubernetes.io/projected/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-kube-api-access-xk89p\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.128582 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.128619 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-swiftconf\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.128643 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-etc-swift\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.128751 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-dispersionconf\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.230518 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-dispersionconf\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.230603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-scripts\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.230635 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk89p\" (UniqueName: \"kubernetes.io/projected/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-kube-api-access-xk89p\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.230662 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.230716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-swiftconf\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.231776 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-etc-swift\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.232166 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-etc-swift\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.232212 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-scripts\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.232634 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.236166 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-dispersionconf\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.245096 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-swiftconf\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.251723 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk89p\" (UniqueName: \"kubernetes.io/projected/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-kube-api-access-xk89p\") pod \"swift-ring-rebalance-debug-lxhv6\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.318363 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.625661 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e508bb-f8cb-4d42-9c75-14e63aee5511" path="/var/lib/kubelet/pods/29e508bb-f8cb-4d42-9c75-14e63aee5511/volumes" Mar 09 16:35:35 crc kubenswrapper[4831]: I0309 16:35:35.775054 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6"] Mar 09 16:35:36 crc kubenswrapper[4831]: I0309 16:35:36.527144 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" event={"ID":"bfe6475e-34d7-4bb2-952a-49a1bc2303bc","Type":"ContainerStarted","Data":"7543ef8bd08ef713307ca7dff175df34d884b79a59613ae619f0c92f0fe8dfd3"} Mar 09 16:35:36 crc kubenswrapper[4831]: I0309 16:35:36.527454 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" event={"ID":"bfe6475e-34d7-4bb2-952a-49a1bc2303bc","Type":"ContainerStarted","Data":"6f93ea23cfb2aae910095f8b55b079a77cf02707f05fefb9c722212c406192ec"} Mar 09 16:35:37 crc kubenswrapper[4831]: I0309 16:35:37.536421 4831 generic.go:334] "Generic (PLEG): container finished" podID="bfe6475e-34d7-4bb2-952a-49a1bc2303bc" containerID="7543ef8bd08ef713307ca7dff175df34d884b79a59613ae619f0c92f0fe8dfd3" exitCode=0 Mar 09 16:35:37 crc kubenswrapper[4831]: I0309 16:35:37.536566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" event={"ID":"bfe6475e-34d7-4bb2-952a-49a1bc2303bc","Type":"ContainerDied","Data":"7543ef8bd08ef713307ca7dff175df34d884b79a59613ae619f0c92f0fe8dfd3"} Mar 09 16:35:38 crc kubenswrapper[4831]: I0309 16:35:38.898996 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:38 crc kubenswrapper[4831]: I0309 16:35:38.928059 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6"] Mar 09 16:35:38 crc kubenswrapper[4831]: I0309 16:35:38.933981 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6"] Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.092798 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-scripts\") pod \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.092867 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-ring-data-devices\") pod \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.092911 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-dispersionconf\") pod \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.092999 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-etc-swift\") pod \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.093049 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-swiftconf\") pod \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.093136 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk89p\" (UniqueName: \"kubernetes.io/projected/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-kube-api-access-xk89p\") pod \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\" (UID: \"bfe6475e-34d7-4bb2-952a-49a1bc2303bc\") " Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.093463 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bfe6475e-34d7-4bb2-952a-49a1bc2303bc" (UID: "bfe6475e-34d7-4bb2-952a-49a1bc2303bc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.093674 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.093917 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bfe6475e-34d7-4bb2-952a-49a1bc2303bc" (UID: "bfe6475e-34d7-4bb2-952a-49a1bc2303bc"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.100078 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-kube-api-access-xk89p" (OuterVolumeSpecName: "kube-api-access-xk89p") pod "bfe6475e-34d7-4bb2-952a-49a1bc2303bc" (UID: "bfe6475e-34d7-4bb2-952a-49a1bc2303bc"). InnerVolumeSpecName "kube-api-access-xk89p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.115844 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-scripts" (OuterVolumeSpecName: "scripts") pod "bfe6475e-34d7-4bb2-952a-49a1bc2303bc" (UID: "bfe6475e-34d7-4bb2-952a-49a1bc2303bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.118787 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bfe6475e-34d7-4bb2-952a-49a1bc2303bc" (UID: "bfe6475e-34d7-4bb2-952a-49a1bc2303bc"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.122628 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bfe6475e-34d7-4bb2-952a-49a1bc2303bc" (UID: "bfe6475e-34d7-4bb2-952a-49a1bc2303bc"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.194788 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.194821 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.194830 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.194839 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk89p\" (UniqueName: \"kubernetes.io/projected/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-kube-api-access-xk89p\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.194850 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfe6475e-34d7-4bb2-952a-49a1bc2303bc-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.557756 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f93ea23cfb2aae910095f8b55b079a77cf02707f05fefb9c722212c406192ec" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.557842 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lxhv6" Mar 09 16:35:39 crc kubenswrapper[4831]: I0309 16:35:39.629481 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe6475e-34d7-4bb2-952a-49a1bc2303bc" path="/var/lib/kubelet/pods/bfe6475e-34d7-4bb2-952a-49a1bc2303bc/volumes" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.065043 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7"] Mar 09 16:35:40 crc kubenswrapper[4831]: E0309 16:35:40.065737 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe6475e-34d7-4bb2-952a-49a1bc2303bc" containerName="swift-ring-rebalance" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.065750 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe6475e-34d7-4bb2-952a-49a1bc2303bc" containerName="swift-ring-rebalance" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.065884 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe6475e-34d7-4bb2-952a-49a1bc2303bc" containerName="swift-ring-rebalance" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.066338 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.068279 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.068281 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.076600 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7"] Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.209731 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbrd\" (UniqueName: \"kubernetes.io/projected/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-kube-api-access-2qbrd\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.209809 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-ring-data-devices\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.209850 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-dispersionconf\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.209887 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-etc-swift\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.210044 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-scripts\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.210129 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-swiftconf\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.311133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-etc-swift\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.311213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-scripts\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.311253 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-swiftconf\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.311323 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbrd\" (UniqueName: \"kubernetes.io/projected/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-kube-api-access-2qbrd\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.311352 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-ring-data-devices\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.311376 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-dispersionconf\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.311678 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-etc-swift\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 
16:35:40.312244 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-ring-data-devices\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.312367 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-scripts\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.315637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-dispersionconf\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.315643 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-swiftconf\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.327195 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbrd\" (UniqueName: \"kubernetes.io/projected/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-kube-api-access-2qbrd\") pod \"swift-ring-rebalance-debug-vtsn7\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.388768 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:40 crc kubenswrapper[4831]: W0309 16:35:40.810702 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a2c1b07_c190_4395_a2ad_1c5c76420ea9.slice/crio-a1c306fb33634eded2f3c06a2867b997b00b456196c6037dd33110d54f93190f WatchSource:0}: Error finding container a1c306fb33634eded2f3c06a2867b997b00b456196c6037dd33110d54f93190f: Status 404 returned error can't find the container with id a1c306fb33634eded2f3c06a2867b997b00b456196c6037dd33110d54f93190f Mar 09 16:35:40 crc kubenswrapper[4831]: I0309 16:35:40.814595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7"] Mar 09 16:35:41 crc kubenswrapper[4831]: I0309 16:35:41.578491 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" event={"ID":"7a2c1b07-c190-4395-a2ad-1c5c76420ea9","Type":"ContainerStarted","Data":"1b0b8fd2849e589c709d3cf40ccd7b8f89cf8f3745b0521db9878e3f513371ca"} Mar 09 16:35:41 crc kubenswrapper[4831]: I0309 16:35:41.578948 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" event={"ID":"7a2c1b07-c190-4395-a2ad-1c5c76420ea9","Type":"ContainerStarted","Data":"a1c306fb33634eded2f3c06a2867b997b00b456196c6037dd33110d54f93190f"} Mar 09 16:35:41 crc kubenswrapper[4831]: I0309 16:35:41.597593 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" podStartSLOduration=1.597563651 podStartE2EDuration="1.597563651s" podCreationTimestamp="2026-03-09 16:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:41.593282829 +0000 UTC m=+2268.726965252" 
watchObservedRunningTime="2026-03-09 16:35:41.597563651 +0000 UTC m=+2268.731246074" Mar 09 16:35:42 crc kubenswrapper[4831]: I0309 16:35:42.588005 4831 generic.go:334] "Generic (PLEG): container finished" podID="7a2c1b07-c190-4395-a2ad-1c5c76420ea9" containerID="1b0b8fd2849e589c709d3cf40ccd7b8f89cf8f3745b0521db9878e3f513371ca" exitCode=0 Mar 09 16:35:42 crc kubenswrapper[4831]: I0309 16:35:42.588076 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" event={"ID":"7a2c1b07-c190-4395-a2ad-1c5c76420ea9","Type":"ContainerDied","Data":"1b0b8fd2849e589c709d3cf40ccd7b8f89cf8f3745b0521db9878e3f513371ca"} Mar 09 16:35:43 crc kubenswrapper[4831]: I0309 16:35:43.899391 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:43 crc kubenswrapper[4831]: I0309 16:35:43.931733 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7"] Mar 09 16:35:43 crc kubenswrapper[4831]: I0309 16:35:43.937982 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7"] Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.073587 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-scripts\") pod \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.073671 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qbrd\" (UniqueName: \"kubernetes.io/projected/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-kube-api-access-2qbrd\") pod \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.073728 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-etc-swift\") pod \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.073781 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-dispersionconf\") pod \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.073813 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-ring-data-devices\") pod \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.073832 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-swiftconf\") pod \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\" (UID: \"7a2c1b07-c190-4395-a2ad-1c5c76420ea9\") " Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.074491 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7a2c1b07-c190-4395-a2ad-1c5c76420ea9" (UID: "7a2c1b07-c190-4395-a2ad-1c5c76420ea9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.074732 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7a2c1b07-c190-4395-a2ad-1c5c76420ea9" (UID: "7a2c1b07-c190-4395-a2ad-1c5c76420ea9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.101149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-kube-api-access-2qbrd" (OuterVolumeSpecName: "kube-api-access-2qbrd") pod "7a2c1b07-c190-4395-a2ad-1c5c76420ea9" (UID: "7a2c1b07-c190-4395-a2ad-1c5c76420ea9"). InnerVolumeSpecName "kube-api-access-2qbrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.102808 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-scripts" (OuterVolumeSpecName: "scripts") pod "7a2c1b07-c190-4395-a2ad-1c5c76420ea9" (UID: "7a2c1b07-c190-4395-a2ad-1c5c76420ea9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.105486 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7a2c1b07-c190-4395-a2ad-1c5c76420ea9" (UID: "7a2c1b07-c190-4395-a2ad-1c5c76420ea9"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.106451 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7a2c1b07-c190-4395-a2ad-1c5c76420ea9" (UID: "7a2c1b07-c190-4395-a2ad-1c5c76420ea9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.175601 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qbrd\" (UniqueName: \"kubernetes.io/projected/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-kube-api-access-2qbrd\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.175636 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.175646 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.175653 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.175662 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.175670 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7a2c1b07-c190-4395-a2ad-1c5c76420ea9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.608916 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1c306fb33634eded2f3c06a2867b997b00b456196c6037dd33110d54f93190f" Mar 09 16:35:44 crc kubenswrapper[4831]: I0309 16:35:44.608991 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vtsn7" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.074818 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2jb95"] Mar 09 16:35:45 crc kubenswrapper[4831]: E0309 16:35:45.075241 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2c1b07-c190-4395-a2ad-1c5c76420ea9" containerName="swift-ring-rebalance" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.075260 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2c1b07-c190-4395-a2ad-1c5c76420ea9" containerName="swift-ring-rebalance" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.075488 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2c1b07-c190-4395-a2ad-1c5c76420ea9" containerName="swift-ring-rebalance" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.076141 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.078044 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.078694 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.085009 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2jb95"] Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.189203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-dispersionconf\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.189254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-scripts\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.190516 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-ring-data-devices\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.190631 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-swiftconf\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.190736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71c17e33-1282-4394-9299-ec4cd5e5b357-etc-swift\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.190791 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbdl\" (UniqueName: \"kubernetes.io/projected/71c17e33-1282-4394-9299-ec4cd5e5b357-kube-api-access-pjbdl\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.292485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-ring-data-devices\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.292541 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-swiftconf\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc 
kubenswrapper[4831]: I0309 16:35:45.292574 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71c17e33-1282-4394-9299-ec4cd5e5b357-etc-swift\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.292604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjbdl\" (UniqueName: \"kubernetes.io/projected/71c17e33-1282-4394-9299-ec4cd5e5b357-kube-api-access-pjbdl\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.292632 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-dispersionconf\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.292654 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-scripts\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.293548 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-scripts\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc 
kubenswrapper[4831]: I0309 16:35:45.293623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-ring-data-devices\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.293685 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71c17e33-1282-4394-9299-ec4cd5e5b357-etc-swift\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.297618 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-swiftconf\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.297683 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-dispersionconf\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.314125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbdl\" (UniqueName: \"kubernetes.io/projected/71c17e33-1282-4394-9299-ec4cd5e5b357-kube-api-access-pjbdl\") pod \"swift-ring-rebalance-debug-2jb95\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: 
I0309 16:35:45.393795 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.626222 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2c1b07-c190-4395-a2ad-1c5c76420ea9" path="/var/lib/kubelet/pods/7a2c1b07-c190-4395-a2ad-1c5c76420ea9/volumes" Mar 09 16:35:45 crc kubenswrapper[4831]: I0309 16:35:45.866130 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2jb95"] Mar 09 16:35:45 crc kubenswrapper[4831]: W0309 16:35:45.868370 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71c17e33_1282_4394_9299_ec4cd5e5b357.slice/crio-f04e2486bd67646fc8dd3a11f34a5052c7183ef05a1c608f77a5b8cd56fde089 WatchSource:0}: Error finding container f04e2486bd67646fc8dd3a11f34a5052c7183ef05a1c608f77a5b8cd56fde089: Status 404 returned error can't find the container with id f04e2486bd67646fc8dd3a11f34a5052c7183ef05a1c608f77a5b8cd56fde089 Mar 09 16:35:46 crc kubenswrapper[4831]: I0309 16:35:46.628788 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" event={"ID":"71c17e33-1282-4394-9299-ec4cd5e5b357","Type":"ContainerStarted","Data":"4e3475bb98648a936091e55878b4ecded93519e7a88193f8835cb22a03ebbf69"} Mar 09 16:35:46 crc kubenswrapper[4831]: I0309 16:35:46.629438 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" event={"ID":"71c17e33-1282-4394-9299-ec4cd5e5b357","Type":"ContainerStarted","Data":"f04e2486bd67646fc8dd3a11f34a5052c7183ef05a1c608f77a5b8cd56fde089"} Mar 09 16:35:46 crc kubenswrapper[4831]: I0309 16:35:46.661651 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" 
podStartSLOduration=1.6616237809999999 podStartE2EDuration="1.661623781s" podCreationTimestamp="2026-03-09 16:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:46.656138606 +0000 UTC m=+2273.789821049" watchObservedRunningTime="2026-03-09 16:35:46.661623781 +0000 UTC m=+2273.795306214" Mar 09 16:35:47 crc kubenswrapper[4831]: I0309 16:35:47.659656 4831 generic.go:334] "Generic (PLEG): container finished" podID="71c17e33-1282-4394-9299-ec4cd5e5b357" containerID="4e3475bb98648a936091e55878b4ecded93519e7a88193f8835cb22a03ebbf69" exitCode=0 Mar 09 16:35:47 crc kubenswrapper[4831]: I0309 16:35:47.659762 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" event={"ID":"71c17e33-1282-4394-9299-ec4cd5e5b357","Type":"ContainerDied","Data":"4e3475bb98648a936091e55878b4ecded93519e7a88193f8835cb22a03ebbf69"} Mar 09 16:35:48 crc kubenswrapper[4831]: I0309 16:35:48.947886 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:48 crc kubenswrapper[4831]: I0309 16:35:48.987971 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2jb95"] Mar 09 16:35:48 crc kubenswrapper[4831]: I0309 16:35:48.996880 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2jb95"] Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.145088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-scripts\") pod \"71c17e33-1282-4394-9299-ec4cd5e5b357\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.145239 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-dispersionconf\") pod \"71c17e33-1282-4394-9299-ec4cd5e5b357\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.145259 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-ring-data-devices\") pod \"71c17e33-1282-4394-9299-ec4cd5e5b357\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.145281 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-swiftconf\") pod \"71c17e33-1282-4394-9299-ec4cd5e5b357\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.145319 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjbdl\" 
(UniqueName: \"kubernetes.io/projected/71c17e33-1282-4394-9299-ec4cd5e5b357-kube-api-access-pjbdl\") pod \"71c17e33-1282-4394-9299-ec4cd5e5b357\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.145339 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71c17e33-1282-4394-9299-ec4cd5e5b357-etc-swift\") pod \"71c17e33-1282-4394-9299-ec4cd5e5b357\" (UID: \"71c17e33-1282-4394-9299-ec4cd5e5b357\") " Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.146503 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c17e33-1282-4394-9299-ec4cd5e5b357-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "71c17e33-1282-4394-9299-ec4cd5e5b357" (UID: "71c17e33-1282-4394-9299-ec4cd5e5b357"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.146742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "71c17e33-1282-4394-9299-ec4cd5e5b357" (UID: "71c17e33-1282-4394-9299-ec4cd5e5b357"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.150639 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c17e33-1282-4394-9299-ec4cd5e5b357-kube-api-access-pjbdl" (OuterVolumeSpecName: "kube-api-access-pjbdl") pod "71c17e33-1282-4394-9299-ec4cd5e5b357" (UID: "71c17e33-1282-4394-9299-ec4cd5e5b357"). InnerVolumeSpecName "kube-api-access-pjbdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.166449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "71c17e33-1282-4394-9299-ec4cd5e5b357" (UID: "71c17e33-1282-4394-9299-ec4cd5e5b357"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.168133 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-scripts" (OuterVolumeSpecName: "scripts") pod "71c17e33-1282-4394-9299-ec4cd5e5b357" (UID: "71c17e33-1282-4394-9299-ec4cd5e5b357"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.177700 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "71c17e33-1282-4394-9299-ec4cd5e5b357" (UID: "71c17e33-1282-4394-9299-ec4cd5e5b357"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.247954 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.248001 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.248021 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71c17e33-1282-4394-9299-ec4cd5e5b357-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.248034 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71c17e33-1282-4394-9299-ec4cd5e5b357-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.248048 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjbdl\" (UniqueName: \"kubernetes.io/projected/71c17e33-1282-4394-9299-ec4cd5e5b357-kube-api-access-pjbdl\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.248060 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71c17e33-1282-4394-9299-ec4cd5e5b357-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.626742 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c17e33-1282-4394-9299-ec4cd5e5b357" path="/var/lib/kubelet/pods/71c17e33-1282-4394-9299-ec4cd5e5b357/volumes" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.676826 4831 scope.go:117] "RemoveContainer" 
containerID="4e3475bb98648a936091e55878b4ecded93519e7a88193f8835cb22a03ebbf69" Mar 09 16:35:49 crc kubenswrapper[4831]: I0309 16:35:49.676887 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2jb95" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.118067 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj"] Mar 09 16:35:50 crc kubenswrapper[4831]: E0309 16:35:50.118376 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c17e33-1282-4394-9299-ec4cd5e5b357" containerName="swift-ring-rebalance" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.118387 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c17e33-1282-4394-9299-ec4cd5e5b357" containerName="swift-ring-rebalance" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.118547 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c17e33-1282-4394-9299-ec4cd5e5b357" containerName="swift-ring-rebalance" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.119030 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.120790 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.121327 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.130632 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj"] Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.160129 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-swiftconf\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.160203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-etc-swift\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.160273 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-scripts\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.160329 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-ring-data-devices\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.160386 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-dispersionconf\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.160459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588m2\" (UniqueName: \"kubernetes.io/projected/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-kube-api-access-588m2\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.262072 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-swiftconf\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.262151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-etc-swift\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.262234 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-scripts\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.262273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-ring-data-devices\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.262325 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-dispersionconf\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.262355 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588m2\" (UniqueName: \"kubernetes.io/projected/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-kube-api-access-588m2\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.263123 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-scripts\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 
16:35:50.263327 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-ring-data-devices\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.263659 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-etc-swift\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.268244 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-swiftconf\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.268556 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-dispersionconf\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.278091 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588m2\" (UniqueName: \"kubernetes.io/projected/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-kube-api-access-588m2\") pod \"swift-ring-rebalance-debug-4mcrj\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.454732 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:50 crc kubenswrapper[4831]: I0309 16:35:50.911287 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj"] Mar 09 16:35:51 crc kubenswrapper[4831]: I0309 16:35:51.701816 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" event={"ID":"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45","Type":"ContainerStarted","Data":"95b0146eaa2253c3faa8461b3cd3bffc584d7a93938350d735aa2254e245fff9"} Mar 09 16:35:51 crc kubenswrapper[4831]: I0309 16:35:51.702342 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" event={"ID":"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45","Type":"ContainerStarted","Data":"2936a05aad28d46c1ad905fa530faad6f04a31fbaa504dd85b6152f99e6d8573"} Mar 09 16:35:51 crc kubenswrapper[4831]: I0309 16:35:51.721233 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" podStartSLOduration=1.721207626 podStartE2EDuration="1.721207626s" podCreationTimestamp="2026-03-09 16:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:51.716604786 +0000 UTC m=+2278.850287229" watchObservedRunningTime="2026-03-09 16:35:51.721207626 +0000 UTC m=+2278.854890049" Mar 09 16:35:52 crc kubenswrapper[4831]: I0309 16:35:52.714875 4831 generic.go:334] "Generic (PLEG): container finished" podID="a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" containerID="95b0146eaa2253c3faa8461b3cd3bffc584d7a93938350d735aa2254e245fff9" exitCode=0 Mar 09 16:35:52 crc kubenswrapper[4831]: I0309 16:35:52.714937 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" 
event={"ID":"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45","Type":"ContainerDied","Data":"95b0146eaa2253c3faa8461b3cd3bffc584d7a93938350d735aa2254e245fff9"} Mar 09 16:35:53 crc kubenswrapper[4831]: I0309 16:35:53.971755 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.004901 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj"] Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.014673 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj"] Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.155554 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-588m2\" (UniqueName: \"kubernetes.io/projected/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-kube-api-access-588m2\") pod \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.155757 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-ring-data-devices\") pod \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.155791 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-dispersionconf\") pod \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.155835 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-etc-swift\") pod \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.155861 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-scripts\") pod \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.155918 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-swiftconf\") pod \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\" (UID: \"a9843871-8fd3-40b5-bb9a-cfa9e0e4be45\") " Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.157236 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" (UID: "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.157265 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" (UID: "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.161949 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-kube-api-access-588m2" (OuterVolumeSpecName: "kube-api-access-588m2") pod "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" (UID: "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45"). InnerVolumeSpecName "kube-api-access-588m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.176629 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-scripts" (OuterVolumeSpecName: "scripts") pod "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" (UID: "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.178605 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" (UID: "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.179614 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" (UID: "a9843871-8fd3-40b5-bb9a-cfa9e0e4be45"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.256931 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-588m2\" (UniqueName: \"kubernetes.io/projected/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-kube-api-access-588m2\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.256963 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.256972 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.256981 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.256990 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.256998 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.730959 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2936a05aad28d46c1ad905fa530faad6f04a31fbaa504dd85b6152f99e6d8573" Mar 09 16:35:54 crc kubenswrapper[4831]: I0309 16:35:54.731041 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4mcrj" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.138591 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-br4kw"] Mar 09 16:35:55 crc kubenswrapper[4831]: E0309 16:35:55.138917 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" containerName="swift-ring-rebalance" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.139117 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" containerName="swift-ring-rebalance" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.139268 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" containerName="swift-ring-rebalance" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.139741 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.143474 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.148232 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.164895 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-br4kw"] Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.170598 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-ring-data-devices\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.170650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-dispersionconf\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.170711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-scripts\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.170755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b19a623-ee86-4c77-a564-4deb6e667666-etc-swift\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.170780 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-swiftconf\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.170806 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4z7z\" (UniqueName: 
\"kubernetes.io/projected/3b19a623-ee86-4c77-a564-4deb6e667666-kube-api-access-c4z7z\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.271996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b19a623-ee86-4c77-a564-4deb6e667666-etc-swift\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.272061 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-swiftconf\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.272091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4z7z\" (UniqueName: \"kubernetes.io/projected/3b19a623-ee86-4c77-a564-4deb6e667666-kube-api-access-c4z7z\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.272129 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-ring-data-devices\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.272153 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-dispersionconf\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.272201 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-scripts\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.272341 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b19a623-ee86-4c77-a564-4deb6e667666-etc-swift\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.272993 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-scripts\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.273093 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-ring-data-devices\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.275487 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-dispersionconf\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.275903 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-swiftconf\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.291396 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4z7z\" (UniqueName: \"kubernetes.io/projected/3b19a623-ee86-4c77-a564-4deb6e667666-kube-api-access-c4z7z\") pod \"swift-ring-rebalance-debug-br4kw\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.463129 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.626349 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9843871-8fd3-40b5-bb9a-cfa9e0e4be45" path="/var/lib/kubelet/pods/a9843871-8fd3-40b5-bb9a-cfa9e0e4be45/volumes" Mar 09 16:35:55 crc kubenswrapper[4831]: I0309 16:35:55.883733 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-br4kw"] Mar 09 16:35:56 crc kubenswrapper[4831]: I0309 16:35:56.751120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" event={"ID":"3b19a623-ee86-4c77-a564-4deb6e667666","Type":"ContainerStarted","Data":"ff6a748c63df20722ddde95c269ca16dc6f2c4ee7db4cd796a4e45c50c53a005"} Mar 09 16:35:56 crc kubenswrapper[4831]: I0309 16:35:56.751176 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" event={"ID":"3b19a623-ee86-4c77-a564-4deb6e667666","Type":"ContainerStarted","Data":"d10dad865a230e4ee65901012178e886add8d7e9cf0bd24dd1fc0d728e399bd4"} Mar 09 16:35:56 crc kubenswrapper[4831]: I0309 16:35:56.768872 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" podStartSLOduration=1.768850182 podStartE2EDuration="1.768850182s" podCreationTimestamp="2026-03-09 16:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:35:56.766906117 +0000 UTC m=+2283.900588540" watchObservedRunningTime="2026-03-09 16:35:56.768850182 +0000 UTC m=+2283.902532625" Mar 09 16:35:57 crc kubenswrapper[4831]: I0309 16:35:57.766216 4831 generic.go:334] "Generic (PLEG): container finished" podID="3b19a623-ee86-4c77-a564-4deb6e667666" containerID="ff6a748c63df20722ddde95c269ca16dc6f2c4ee7db4cd796a4e45c50c53a005" exitCode=0 
Mar 09 16:35:57 crc kubenswrapper[4831]: I0309 16:35:57.766800 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" event={"ID":"3b19a623-ee86-4c77-a564-4deb6e667666","Type":"ContainerDied","Data":"ff6a748c63df20722ddde95c269ca16dc6f2c4ee7db4cd796a4e45c50c53a005"} Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.016733 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.054212 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-br4kw"] Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.062596 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-br4kw"] Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.126686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-ring-data-devices\") pod \"3b19a623-ee86-4c77-a564-4deb6e667666\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.126751 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b19a623-ee86-4c77-a564-4deb6e667666-etc-swift\") pod \"3b19a623-ee86-4c77-a564-4deb6e667666\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.126813 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-scripts\") pod \"3b19a623-ee86-4c77-a564-4deb6e667666\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.126841 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4z7z\" (UniqueName: \"kubernetes.io/projected/3b19a623-ee86-4c77-a564-4deb6e667666-kube-api-access-c4z7z\") pod \"3b19a623-ee86-4c77-a564-4deb6e667666\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.126889 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-swiftconf\") pod \"3b19a623-ee86-4c77-a564-4deb6e667666\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.126937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-dispersionconf\") pod \"3b19a623-ee86-4c77-a564-4deb6e667666\" (UID: \"3b19a623-ee86-4c77-a564-4deb6e667666\") " Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.127479 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3b19a623-ee86-4c77-a564-4deb6e667666" (UID: "3b19a623-ee86-4c77-a564-4deb6e667666"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.128012 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b19a623-ee86-4c77-a564-4deb6e667666-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3b19a623-ee86-4c77-a564-4deb6e667666" (UID: "3b19a623-ee86-4c77-a564-4deb6e667666"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.132055 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b19a623-ee86-4c77-a564-4deb6e667666-kube-api-access-c4z7z" (OuterVolumeSpecName: "kube-api-access-c4z7z") pod "3b19a623-ee86-4c77-a564-4deb6e667666" (UID: "3b19a623-ee86-4c77-a564-4deb6e667666"). InnerVolumeSpecName "kube-api-access-c4z7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.149345 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3b19a623-ee86-4c77-a564-4deb6e667666" (UID: "3b19a623-ee86-4c77-a564-4deb6e667666"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.150864 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3b19a623-ee86-4c77-a564-4deb6e667666" (UID: "3b19a623-ee86-4c77-a564-4deb6e667666"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.154506 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-scripts" (OuterVolumeSpecName: "scripts") pod "3b19a623-ee86-4c77-a564-4deb6e667666" (UID: "3b19a623-ee86-4c77-a564-4deb6e667666"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.228601 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b19a623-ee86-4c77-a564-4deb6e667666-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.228657 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.228677 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4z7z\" (UniqueName: \"kubernetes.io/projected/3b19a623-ee86-4c77-a564-4deb6e667666-kube-api-access-c4z7z\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.228701 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.228723 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3b19a623-ee86-4c77-a564-4deb6e667666-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.228739 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b19a623-ee86-4c77-a564-4deb6e667666-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.643046 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b19a623-ee86-4c77-a564-4deb6e667666" path="/var/lib/kubelet/pods/3b19a623-ee86-4c77-a564-4deb6e667666/volumes" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.783567 4831 scope.go:117] "RemoveContainer" 
containerID="ff6a748c63df20722ddde95c269ca16dc6f2c4ee7db4cd796a4e45c50c53a005" Mar 09 16:35:59 crc kubenswrapper[4831]: I0309 16:35:59.783645 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-br4kw" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.143543 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551236-6cqjf"] Mar 09 16:36:00 crc kubenswrapper[4831]: E0309 16:36:00.143935 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b19a623-ee86-4c77-a564-4deb6e667666" containerName="swift-ring-rebalance" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.143952 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b19a623-ee86-4c77-a564-4deb6e667666" containerName="swift-ring-rebalance" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.144107 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b19a623-ee86-4c77-a564-4deb6e667666" containerName="swift-ring-rebalance" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.145930 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551236-6cqjf" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.146139 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk67s\" (UniqueName: \"kubernetes.io/projected/1d23ea67-6361-4fa7-b66a-3406731b8b5d-kube-api-access-rk67s\") pod \"auto-csr-approver-29551236-6cqjf\" (UID: \"1d23ea67-6361-4fa7-b66a-3406731b8b5d\") " pod="openshift-infra/auto-csr-approver-29551236-6cqjf" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.151503 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.151624 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.152355 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551236-6cqjf"] Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.152614 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.185069 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4h48"] Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.186032 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.189655 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.190355 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.196754 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4h48"] Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.247753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-etc-swift\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.247817 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-ring-data-devices\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.247872 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-dispersionconf\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.247931 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rk67s\" (UniqueName: \"kubernetes.io/projected/1d23ea67-6361-4fa7-b66a-3406731b8b5d-kube-api-access-rk67s\") pod \"auto-csr-approver-29551236-6cqjf\" (UID: \"1d23ea67-6361-4fa7-b66a-3406731b8b5d\") " pod="openshift-infra/auto-csr-approver-29551236-6cqjf" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.247973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-scripts\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.248063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkg6m\" (UniqueName: \"kubernetes.io/projected/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-kube-api-access-gkg6m\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.248121 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-swiftconf\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.267508 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk67s\" (UniqueName: \"kubernetes.io/projected/1d23ea67-6361-4fa7-b66a-3406731b8b5d-kube-api-access-rk67s\") pod \"auto-csr-approver-29551236-6cqjf\" (UID: \"1d23ea67-6361-4fa7-b66a-3406731b8b5d\") " pod="openshift-infra/auto-csr-approver-29551236-6cqjf" Mar 09 16:36:00 crc 
kubenswrapper[4831]: I0309 16:36:00.349133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-swiftconf\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.349207 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-etc-swift\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.349249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-ring-data-devices\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.349289 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-dispersionconf\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.349335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-scripts\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc 
kubenswrapper[4831]: I0309 16:36:00.349377 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkg6m\" (UniqueName: \"kubernetes.io/projected/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-kube-api-access-gkg6m\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.349675 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-etc-swift\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.350377 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-scripts\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.350395 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-ring-data-devices\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.352812 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-dispersionconf\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc 
kubenswrapper[4831]: I0309 16:36:00.353376 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-swiftconf\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.366775 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkg6m\" (UniqueName: \"kubernetes.io/projected/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-kube-api-access-gkg6m\") pod \"swift-ring-rebalance-debug-w4h48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.470453 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551236-6cqjf" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.508146 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.788633 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4h48"] Mar 09 16:36:00 crc kubenswrapper[4831]: W0309 16:36:00.800823 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bce700a_e0f8_444b_abbe_51e5a6c6fd48.slice/crio-bd7cdea2c677593a95a85690013d8f64123b5a40026a3ee9a02944565dd905a0 WatchSource:0}: Error finding container bd7cdea2c677593a95a85690013d8f64123b5a40026a3ee9a02944565dd905a0: Status 404 returned error can't find the container with id bd7cdea2c677593a95a85690013d8f64123b5a40026a3ee9a02944565dd905a0 Mar 09 16:36:00 crc kubenswrapper[4831]: I0309 16:36:00.930644 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551236-6cqjf"] Mar 09 16:36:00 crc kubenswrapper[4831]: W0309 16:36:00.944342 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d23ea67_6361_4fa7_b66a_3406731b8b5d.slice/crio-12b5d0d54112c1b08235d45691326fafe3f79fa0e1caacd326d43095f9e6a5b2 WatchSource:0}: Error finding container 12b5d0d54112c1b08235d45691326fafe3f79fa0e1caacd326d43095f9e6a5b2: Status 404 returned error can't find the container with id 12b5d0d54112c1b08235d45691326fafe3f79fa0e1caacd326d43095f9e6a5b2 Mar 09 16:36:01 crc kubenswrapper[4831]: I0309 16:36:01.806548 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551236-6cqjf" event={"ID":"1d23ea67-6361-4fa7-b66a-3406731b8b5d","Type":"ContainerStarted","Data":"12b5d0d54112c1b08235d45691326fafe3f79fa0e1caacd326d43095f9e6a5b2"} Mar 09 16:36:01 crc kubenswrapper[4831]: I0309 16:36:01.810144 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" event={"ID":"0bce700a-e0f8-444b-abbe-51e5a6c6fd48","Type":"ContainerStarted","Data":"21684a69419ec4141daf231dbae65205810af78772b059a2f08a6b1b29efd88c"} Mar 09 16:36:01 crc kubenswrapper[4831]: I0309 16:36:01.810345 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" event={"ID":"0bce700a-e0f8-444b-abbe-51e5a6c6fd48","Type":"ContainerStarted","Data":"bd7cdea2c677593a95a85690013d8f64123b5a40026a3ee9a02944565dd905a0"} Mar 09 16:36:01 crc kubenswrapper[4831]: I0309 16:36:01.834637 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" podStartSLOduration=1.834617701 podStartE2EDuration="1.834617701s" podCreationTimestamp="2026-03-09 16:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:01.832692477 +0000 UTC m=+2288.966374900" watchObservedRunningTime="2026-03-09 16:36:01.834617701 +0000 UTC m=+2288.968300124" Mar 09 16:36:02 crc kubenswrapper[4831]: I0309 16:36:02.820932 4831 generic.go:334] "Generic (PLEG): container finished" podID="1d23ea67-6361-4fa7-b66a-3406731b8b5d" containerID="f2fadcf8cac962c17d3b4775e23a348f950ff77c6ee60b8ddd91196a79fdf78d" exitCode=0 Mar 09 16:36:02 crc kubenswrapper[4831]: I0309 16:36:02.820991 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551236-6cqjf" event={"ID":"1d23ea67-6361-4fa7-b66a-3406731b8b5d","Type":"ContainerDied","Data":"f2fadcf8cac962c17d3b4775e23a348f950ff77c6ee60b8ddd91196a79fdf78d"} Mar 09 16:36:02 crc kubenswrapper[4831]: I0309 16:36:02.824496 4831 generic.go:334] "Generic (PLEG): container finished" podID="0bce700a-e0f8-444b-abbe-51e5a6c6fd48" containerID="21684a69419ec4141daf231dbae65205810af78772b059a2f08a6b1b29efd88c" exitCode=0 Mar 09 16:36:02 crc kubenswrapper[4831]: I0309 
16:36:02.824552 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" event={"ID":"0bce700a-e0f8-444b-abbe-51e5a6c6fd48","Type":"ContainerDied","Data":"21684a69419ec4141daf231dbae65205810af78772b059a2f08a6b1b29efd88c"} Mar 09 16:36:03 crc kubenswrapper[4831]: I0309 16:36:03.018814 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:36:03 crc kubenswrapper[4831]: I0309 16:36:03.018873 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.217541 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.224389 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551236-6cqjf" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.266149 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4h48"] Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.273880 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4h48"] Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.415460 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-ring-data-devices\") pod \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.415601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-dispersionconf\") pod \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.415645 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-swiftconf\") pod \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.415679 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk67s\" (UniqueName: \"kubernetes.io/projected/1d23ea67-6361-4fa7-b66a-3406731b8b5d-kube-api-access-rk67s\") pod \"1d23ea67-6361-4fa7-b66a-3406731b8b5d\" (UID: \"1d23ea67-6361-4fa7-b66a-3406731b8b5d\") " Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.415755 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-scripts\") pod \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.416028 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0bce700a-e0f8-444b-abbe-51e5a6c6fd48" (UID: "0bce700a-e0f8-444b-abbe-51e5a6c6fd48"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.416494 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-etc-swift\") pod \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.416795 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkg6m\" (UniqueName: \"kubernetes.io/projected/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-kube-api-access-gkg6m\") pod \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\" (UID: \"0bce700a-e0f8-444b-abbe-51e5a6c6fd48\") " Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.417140 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0bce700a-e0f8-444b-abbe-51e5a6c6fd48" (UID: "0bce700a-e0f8-444b-abbe-51e5a6c6fd48"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.417154 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.421473 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d23ea67-6361-4fa7-b66a-3406731b8b5d-kube-api-access-rk67s" (OuterVolumeSpecName: "kube-api-access-rk67s") pod "1d23ea67-6361-4fa7-b66a-3406731b8b5d" (UID: "1d23ea67-6361-4fa7-b66a-3406731b8b5d"). InnerVolumeSpecName "kube-api-access-rk67s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.423779 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-kube-api-access-gkg6m" (OuterVolumeSpecName: "kube-api-access-gkg6m") pod "0bce700a-e0f8-444b-abbe-51e5a6c6fd48" (UID: "0bce700a-e0f8-444b-abbe-51e5a6c6fd48"). InnerVolumeSpecName "kube-api-access-gkg6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.437671 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-scripts" (OuterVolumeSpecName: "scripts") pod "0bce700a-e0f8-444b-abbe-51e5a6c6fd48" (UID: "0bce700a-e0f8-444b-abbe-51e5a6c6fd48"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.437947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0bce700a-e0f8-444b-abbe-51e5a6c6fd48" (UID: "0bce700a-e0f8-444b-abbe-51e5a6c6fd48"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.438931 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0bce700a-e0f8-444b-abbe-51e5a6c6fd48" (UID: "0bce700a-e0f8-444b-abbe-51e5a6c6fd48"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.518589 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.518631 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk67s\" (UniqueName: \"kubernetes.io/projected/1d23ea67-6361-4fa7-b66a-3406731b8b5d-kube-api-access-rk67s\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.518648 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.518663 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:04 crc 
kubenswrapper[4831]: I0309 16:36:04.518677 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkg6m\" (UniqueName: \"kubernetes.io/projected/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-kube-api-access-gkg6m\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.518688 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0bce700a-e0f8-444b-abbe-51e5a6c6fd48-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.843881 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551236-6cqjf" event={"ID":"1d23ea67-6361-4fa7-b66a-3406731b8b5d","Type":"ContainerDied","Data":"12b5d0d54112c1b08235d45691326fafe3f79fa0e1caacd326d43095f9e6a5b2"} Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.843931 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b5d0d54112c1b08235d45691326fafe3f79fa0e1caacd326d43095f9e6a5b2" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.843909 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551236-6cqjf" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.846728 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7cdea2c677593a95a85690013d8f64123b5a40026a3ee9a02944565dd905a0" Mar 09 16:36:04 crc kubenswrapper[4831]: I0309 16:36:04.846768 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4h48" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.320183 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551230-sz9vm"] Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.326500 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551230-sz9vm"] Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.385605 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r"] Mar 09 16:36:05 crc kubenswrapper[4831]: E0309 16:36:05.385952 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce700a-e0f8-444b-abbe-51e5a6c6fd48" containerName="swift-ring-rebalance" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.385975 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce700a-e0f8-444b-abbe-51e5a6c6fd48" containerName="swift-ring-rebalance" Mar 09 16:36:05 crc kubenswrapper[4831]: E0309 16:36:05.386012 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d23ea67-6361-4fa7-b66a-3406731b8b5d" containerName="oc" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.386021 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d23ea67-6361-4fa7-b66a-3406731b8b5d" containerName="oc" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.386199 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d23ea67-6361-4fa7-b66a-3406731b8b5d" containerName="oc" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.386221 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce700a-e0f8-444b-abbe-51e5a6c6fd48" containerName="swift-ring-rebalance" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.386717 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.388709 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.395695 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r"] Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.397057 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.530452 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-985tf\" (UniqueName: \"kubernetes.io/projected/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-kube-api-access-985tf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.530513 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-scripts\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.530569 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-swiftconf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.530601 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-dispersionconf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.530631 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-etc-swift\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.530802 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-ring-data-devices\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.625309 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c2aa8a-7df6-4198-95c3-bdb7c661a845" path="/var/lib/kubelet/pods/01c2aa8a-7df6-4198-95c3-bdb7c661a845/volumes" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.626066 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bce700a-e0f8-444b-abbe-51e5a6c6fd48" path="/var/lib/kubelet/pods/0bce700a-e0f8-444b-abbe-51e5a6c6fd48/volumes" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.631996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-ring-data-devices\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: 
\"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.632052 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-985tf\" (UniqueName: \"kubernetes.io/projected/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-kube-api-access-985tf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.632077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-scripts\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.632123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-swiftconf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.632155 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-dispersionconf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.632195 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-etc-swift\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: 
\"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.632583 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-etc-swift\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.633607 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-ring-data-devices\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.633781 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-scripts\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.642918 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-swiftconf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.642989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-dispersionconf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.660234 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-985tf\" (UniqueName: \"kubernetes.io/projected/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-kube-api-access-985tf\") pod \"swift-ring-rebalance-debug-9cw9r\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:05 crc kubenswrapper[4831]: I0309 16:36:05.704574 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:06 crc kubenswrapper[4831]: I0309 16:36:06.113256 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r"] Mar 09 16:36:06 crc kubenswrapper[4831]: W0309 16:36:06.130844 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85e913e_4830_4ac8_8ed5_ef0e63d1c40e.slice/crio-be7b97ae9c31c469af63ac69bf68da1e6611f175f78f9e364a4c7294414d7471 WatchSource:0}: Error finding container be7b97ae9c31c469af63ac69bf68da1e6611f175f78f9e364a4c7294414d7471: Status 404 returned error can't find the container with id be7b97ae9c31c469af63ac69bf68da1e6611f175f78f9e364a4c7294414d7471 Mar 09 16:36:06 crc kubenswrapper[4831]: I0309 16:36:06.868467 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" event={"ID":"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e","Type":"ContainerStarted","Data":"788ad8ccff1c260743b9483baa43c992d1ba6b0be3b95200726e112b04e3cbe7"} Mar 09 16:36:06 crc kubenswrapper[4831]: I0309 16:36:06.868813 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" 
event={"ID":"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e","Type":"ContainerStarted","Data":"be7b97ae9c31c469af63ac69bf68da1e6611f175f78f9e364a4c7294414d7471"} Mar 09 16:36:06 crc kubenswrapper[4831]: I0309 16:36:06.890218 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" podStartSLOduration=1.890201223 podStartE2EDuration="1.890201223s" podCreationTimestamp="2026-03-09 16:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:06.890088239 +0000 UTC m=+2294.023770682" watchObservedRunningTime="2026-03-09 16:36:06.890201223 +0000 UTC m=+2294.023883646" Mar 09 16:36:07 crc kubenswrapper[4831]: I0309 16:36:07.876577 4831 generic.go:334] "Generic (PLEG): container finished" podID="e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" containerID="788ad8ccff1c260743b9483baa43c992d1ba6b0be3b95200726e112b04e3cbe7" exitCode=0 Mar 09 16:36:07 crc kubenswrapper[4831]: I0309 16:36:07.876848 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" event={"ID":"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e","Type":"ContainerDied","Data":"788ad8ccff1c260743b9483baa43c992d1ba6b0be3b95200726e112b04e3cbe7"} Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.253107 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.284545 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r"] Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.290303 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r"] Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.406921 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-ring-data-devices\") pod \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.407056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-scripts\") pod \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.407088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-etc-swift\") pod \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.407119 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-swiftconf\") pod \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.407152 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-985tf\" 
(UniqueName: \"kubernetes.io/projected/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-kube-api-access-985tf\") pod \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.407201 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-dispersionconf\") pod \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\" (UID: \"e85e913e-4830-4ac8-8ed5-ef0e63d1c40e\") " Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.407874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" (UID: "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.408157 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" (UID: "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.412472 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-kube-api-access-985tf" (OuterVolumeSpecName: "kube-api-access-985tf") pod "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" (UID: "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e"). InnerVolumeSpecName "kube-api-access-985tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.426345 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-scripts" (OuterVolumeSpecName: "scripts") pod "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" (UID: "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.428894 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" (UID: "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.439265 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" (UID: "e85e913e-4830-4ac8-8ed5-ef0e63d1c40e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.508463 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.508489 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.508498 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.508507 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-985tf\" (UniqueName: \"kubernetes.io/projected/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-kube-api-access-985tf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.508517 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.508525 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.625952 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" path="/var/lib/kubelet/pods/e85e913e-4830-4ac8-8ed5-ef0e63d1c40e/volumes" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.893463 4831 scope.go:117] "RemoveContainer" 
containerID="788ad8ccff1c260743b9483baa43c992d1ba6b0be3b95200726e112b04e3cbe7" Mar 09 16:36:09 crc kubenswrapper[4831]: I0309 16:36:09.893511 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cw9r" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.429805 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh"] Mar 09 16:36:10 crc kubenswrapper[4831]: E0309 16:36:10.430376 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" containerName="swift-ring-rebalance" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.432474 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" containerName="swift-ring-rebalance" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.432990 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85e913e-4830-4ac8-8ed5-ef0e63d1c40e" containerName="swift-ring-rebalance" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.433749 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.436530 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.437485 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.438662 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh"] Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.524648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-swiftconf\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.524800 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-dispersionconf\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.525086 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2hfv\" (UniqueName: \"kubernetes.io/projected/4b415ce3-6108-4479-9cec-e1b2640a1352-kube-api-access-b2hfv\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.525132 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-ring-data-devices\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.525161 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-scripts\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.525240 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b415ce3-6108-4479-9cec-e1b2640a1352-etc-swift\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.626918 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2hfv\" (UniqueName: \"kubernetes.io/projected/4b415ce3-6108-4479-9cec-e1b2640a1352-kube-api-access-b2hfv\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.627276 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-ring-data-devices\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 
16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.627303 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-scripts\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.627334 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b415ce3-6108-4479-9cec-e1b2640a1352-etc-swift\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.627377 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-swiftconf\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.627443 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-dispersionconf\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.627959 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b415ce3-6108-4479-9cec-e1b2640a1352-etc-swift\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc 
kubenswrapper[4831]: I0309 16:36:10.628022 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-ring-data-devices\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.628559 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-scripts\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.632439 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-swiftconf\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.639841 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-dispersionconf\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: I0309 16:36:10.644836 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2hfv\" (UniqueName: \"kubernetes.io/projected/4b415ce3-6108-4479-9cec-e1b2640a1352-kube-api-access-b2hfv\") pod \"swift-ring-rebalance-debug-qqqqh\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:10 crc kubenswrapper[4831]: 
I0309 16:36:10.753451 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:11 crc kubenswrapper[4831]: I0309 16:36:11.162024 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh"] Mar 09 16:36:11 crc kubenswrapper[4831]: I0309 16:36:11.917642 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" event={"ID":"4b415ce3-6108-4479-9cec-e1b2640a1352","Type":"ContainerStarted","Data":"2ff015b92df50ca487a5d142510545241754fb9210ecbb6531a2fa3e1a16b0e5"} Mar 09 16:36:11 crc kubenswrapper[4831]: I0309 16:36:11.917976 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" event={"ID":"4b415ce3-6108-4479-9cec-e1b2640a1352","Type":"ContainerStarted","Data":"a1ec334501ce7da1e2ac89a8b7633b27bba281a3de1c341a4ba46885baee64ad"} Mar 09 16:36:11 crc kubenswrapper[4831]: I0309 16:36:11.936233 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" podStartSLOduration=1.9362163209999999 podStartE2EDuration="1.936216321s" podCreationTimestamp="2026-03-09 16:36:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:11.932362482 +0000 UTC m=+2299.066044905" watchObservedRunningTime="2026-03-09 16:36:11.936216321 +0000 UTC m=+2299.069898744" Mar 09 16:36:12 crc kubenswrapper[4831]: I0309 16:36:12.929041 4831 generic.go:334] "Generic (PLEG): container finished" podID="4b415ce3-6108-4479-9cec-e1b2640a1352" containerID="2ff015b92df50ca487a5d142510545241754fb9210ecbb6531a2fa3e1a16b0e5" exitCode=0 Mar 09 16:36:12 crc kubenswrapper[4831]: I0309 16:36:12.929087 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" event={"ID":"4b415ce3-6108-4479-9cec-e1b2640a1352","Type":"ContainerDied","Data":"2ff015b92df50ca487a5d142510545241754fb9210ecbb6531a2fa3e1a16b0e5"} Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.208421 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.264993 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh"] Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.271743 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh"] Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.279544 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b415ce3-6108-4479-9cec-e1b2640a1352-etc-swift\") pod \"4b415ce3-6108-4479-9cec-e1b2640a1352\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.279638 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-ring-data-devices\") pod \"4b415ce3-6108-4479-9cec-e1b2640a1352\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.279739 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-scripts\") pod \"4b415ce3-6108-4479-9cec-e1b2640a1352\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.279779 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-dispersionconf\") pod \"4b415ce3-6108-4479-9cec-e1b2640a1352\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.279861 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-swiftconf\") pod \"4b415ce3-6108-4479-9cec-e1b2640a1352\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.279882 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2hfv\" (UniqueName: \"kubernetes.io/projected/4b415ce3-6108-4479-9cec-e1b2640a1352-kube-api-access-b2hfv\") pod \"4b415ce3-6108-4479-9cec-e1b2640a1352\" (UID: \"4b415ce3-6108-4479-9cec-e1b2640a1352\") " Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.280574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4b415ce3-6108-4479-9cec-e1b2640a1352" (UID: "4b415ce3-6108-4479-9cec-e1b2640a1352"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.281082 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b415ce3-6108-4479-9cec-e1b2640a1352-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4b415ce3-6108-4479-9cec-e1b2640a1352" (UID: "4b415ce3-6108-4479-9cec-e1b2640a1352"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.291595 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b415ce3-6108-4479-9cec-e1b2640a1352-kube-api-access-b2hfv" (OuterVolumeSpecName: "kube-api-access-b2hfv") pod "4b415ce3-6108-4479-9cec-e1b2640a1352" (UID: "4b415ce3-6108-4479-9cec-e1b2640a1352"). InnerVolumeSpecName "kube-api-access-b2hfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.302732 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4b415ce3-6108-4479-9cec-e1b2640a1352" (UID: "4b415ce3-6108-4479-9cec-e1b2640a1352"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.306640 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-scripts" (OuterVolumeSpecName: "scripts") pod "4b415ce3-6108-4479-9cec-e1b2640a1352" (UID: "4b415ce3-6108-4479-9cec-e1b2640a1352"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.307609 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4b415ce3-6108-4479-9cec-e1b2640a1352" (UID: "4b415ce3-6108-4479-9cec-e1b2640a1352"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.381297 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.381330 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.381340 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b415ce3-6108-4479-9cec-e1b2640a1352-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.381349 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2hfv\" (UniqueName: \"kubernetes.io/projected/4b415ce3-6108-4479-9cec-e1b2640a1352-kube-api-access-b2hfv\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.381358 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b415ce3-6108-4479-9cec-e1b2640a1352-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.381367 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b415ce3-6108-4479-9cec-e1b2640a1352-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.948502 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1ec334501ce7da1e2ac89a8b7633b27bba281a3de1c341a4ba46885baee64ad" Mar 09 16:36:14 crc kubenswrapper[4831]: I0309 16:36:14.948564 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qqqqh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.399273 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh"] Mar 09 16:36:15 crc kubenswrapper[4831]: E0309 16:36:15.399575 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b415ce3-6108-4479-9cec-e1b2640a1352" containerName="swift-ring-rebalance" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.399586 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b415ce3-6108-4479-9cec-e1b2640a1352" containerName="swift-ring-rebalance" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.399718 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b415ce3-6108-4479-9cec-e1b2640a1352" containerName="swift-ring-rebalance" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.400195 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.403369 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.403599 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.419747 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh"] Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.498892 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a4a5c8-a489-47b6-a5af-148189b49912-etc-swift\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.498944 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-swiftconf\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.498973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-dispersionconf\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.499148 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-ring-data-devices\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.499310 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpwgd\" (UniqueName: \"kubernetes.io/projected/71a4a5c8-a489-47b6-a5af-148189b49912-kube-api-access-tpwgd\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.499456 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-scripts\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.601142 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-ring-data-devices\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.601288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpwgd\" (UniqueName: \"kubernetes.io/projected/71a4a5c8-a489-47b6-a5af-148189b49912-kube-api-access-tpwgd\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.601517 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-scripts\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.601631 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a4a5c8-a489-47b6-a5af-148189b49912-etc-swift\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.601665 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-swiftconf\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.601710 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-dispersionconf\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.602022 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a4a5c8-a489-47b6-a5af-148189b49912-etc-swift\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.602083 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-ring-data-devices\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.602346 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-scripts\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.604913 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-swiftconf\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.605974 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-dispersionconf\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.626188 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpwgd\" (UniqueName: \"kubernetes.io/projected/71a4a5c8-a489-47b6-a5af-148189b49912-kube-api-access-tpwgd\") pod \"swift-ring-rebalance-debug-ztfrh\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.632044 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b415ce3-6108-4479-9cec-e1b2640a1352" path="/var/lib/kubelet/pods/4b415ce3-6108-4479-9cec-e1b2640a1352/volumes" Mar 09 16:36:15 crc kubenswrapper[4831]: I0309 16:36:15.717283 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:16 crc kubenswrapper[4831]: I0309 16:36:16.135984 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh"] Mar 09 16:36:16 crc kubenswrapper[4831]: I0309 16:36:16.966922 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" event={"ID":"71a4a5c8-a489-47b6-a5af-148189b49912","Type":"ContainerStarted","Data":"5eedd65195bc0927338cba5abab6f59ca85eb3135362986fa354017db85a51e2"} Mar 09 16:36:16 crc kubenswrapper[4831]: I0309 16:36:16.967269 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" event={"ID":"71a4a5c8-a489-47b6-a5af-148189b49912","Type":"ContainerStarted","Data":"e352d96ee11d6a7968606c7da4f0bdf278745b7346a654d9fbd5f4fc0af0f820"} Mar 09 16:36:16 crc kubenswrapper[4831]: I0309 16:36:16.985371 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" podStartSLOduration=1.9853488000000001 podStartE2EDuration="1.9853488s" podCreationTimestamp="2026-03-09 16:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:16.980237125 +0000 UTC m=+2304.113919558" watchObservedRunningTime="2026-03-09 16:36:16.9853488 +0000 UTC m=+2304.119031223" Mar 09 16:36:17 crc kubenswrapper[4831]: I0309 16:36:17.976311 4831 generic.go:334] "Generic (PLEG): container finished" podID="71a4a5c8-a489-47b6-a5af-148189b49912" containerID="5eedd65195bc0927338cba5abab6f59ca85eb3135362986fa354017db85a51e2" exitCode=0 Mar 09 16:36:17 crc kubenswrapper[4831]: I0309 16:36:17.976375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" 
event={"ID":"71a4a5c8-a489-47b6-a5af-148189b49912","Type":"ContainerDied","Data":"5eedd65195bc0927338cba5abab6f59ca85eb3135362986fa354017db85a51e2"} Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.262444 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.306639 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh"] Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.314787 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh"] Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.360170 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-swiftconf\") pod \"71a4a5c8-a489-47b6-a5af-148189b49912\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.360241 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpwgd\" (UniqueName: \"kubernetes.io/projected/71a4a5c8-a489-47b6-a5af-148189b49912-kube-api-access-tpwgd\") pod \"71a4a5c8-a489-47b6-a5af-148189b49912\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.360424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a4a5c8-a489-47b6-a5af-148189b49912-etc-swift\") pod \"71a4a5c8-a489-47b6-a5af-148189b49912\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.360453 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-ring-data-devices\") pod \"71a4a5c8-a489-47b6-a5af-148189b49912\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.360532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-dispersionconf\") pod \"71a4a5c8-a489-47b6-a5af-148189b49912\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.360557 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-scripts\") pod \"71a4a5c8-a489-47b6-a5af-148189b49912\" (UID: \"71a4a5c8-a489-47b6-a5af-148189b49912\") " Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.361715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "71a4a5c8-a489-47b6-a5af-148189b49912" (UID: "71a4a5c8-a489-47b6-a5af-148189b49912"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.362423 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a4a5c8-a489-47b6-a5af-148189b49912-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "71a4a5c8-a489-47b6-a5af-148189b49912" (UID: "71a4a5c8-a489-47b6-a5af-148189b49912"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.386789 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a4a5c8-a489-47b6-a5af-148189b49912-kube-api-access-tpwgd" (OuterVolumeSpecName: "kube-api-access-tpwgd") pod "71a4a5c8-a489-47b6-a5af-148189b49912" (UID: "71a4a5c8-a489-47b6-a5af-148189b49912"). InnerVolumeSpecName "kube-api-access-tpwgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.404712 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "71a4a5c8-a489-47b6-a5af-148189b49912" (UID: "71a4a5c8-a489-47b6-a5af-148189b49912"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.405114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-scripts" (OuterVolumeSpecName: "scripts") pod "71a4a5c8-a489-47b6-a5af-148189b49912" (UID: "71a4a5c8-a489-47b6-a5af-148189b49912"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.416131 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "71a4a5c8-a489-47b6-a5af-148189b49912" (UID: "71a4a5c8-a489-47b6-a5af-148189b49912"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.462432 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.462469 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.462480 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a4a5c8-a489-47b6-a5af-148189b49912-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.462495 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpwgd\" (UniqueName: \"kubernetes.io/projected/71a4a5c8-a489-47b6-a5af-148189b49912-kube-api-access-tpwgd\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.462509 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a4a5c8-a489-47b6-a5af-148189b49912-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.462520 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a4a5c8-a489-47b6-a5af-148189b49912-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.626709 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a4a5c8-a489-47b6-a5af-148189b49912" path="/var/lib/kubelet/pods/71a4a5c8-a489-47b6-a5af-148189b49912/volumes" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.995578 4831 scope.go:117] "RemoveContainer" 
containerID="5eedd65195bc0927338cba5abab6f59ca85eb3135362986fa354017db85a51e2" Mar 09 16:36:19 crc kubenswrapper[4831]: I0309 16:36:19.995616 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ztfrh" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.457572 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9"] Mar 09 16:36:20 crc kubenswrapper[4831]: E0309 16:36:20.457942 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a4a5c8-a489-47b6-a5af-148189b49912" containerName="swift-ring-rebalance" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.457956 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a4a5c8-a489-47b6-a5af-148189b49912" containerName="swift-ring-rebalance" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.458098 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a4a5c8-a489-47b6-a5af-148189b49912" containerName="swift-ring-rebalance" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.458609 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.462704 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9"] Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.464008 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.464206 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.578144 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-scripts\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.578190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-etc-swift\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.578233 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-swiftconf\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.578258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-dispersionconf\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.578304 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-ring-data-devices\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.578441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbgj6\" (UniqueName: \"kubernetes.io/projected/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-kube-api-access-jbgj6\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.679369 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-scripts\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.679459 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-etc-swift\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.679520 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-swiftconf\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.679552 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-dispersionconf\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.679651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-ring-data-devices\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.679707 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbgj6\" (UniqueName: \"kubernetes.io/projected/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-kube-api-access-jbgj6\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.680569 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-scripts\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 
16:36:20.681472 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-etc-swift\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.681754 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-ring-data-devices\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.687054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-swiftconf\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.695952 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-dispersionconf\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.703550 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbgj6\" (UniqueName: \"kubernetes.io/projected/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-kube-api-access-jbgj6\") pod \"swift-ring-rebalance-debug-zn5v9\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:20 crc kubenswrapper[4831]: I0309 16:36:20.771917 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:21 crc kubenswrapper[4831]: I0309 16:36:21.207246 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9"] Mar 09 16:36:21 crc kubenswrapper[4831]: W0309 16:36:21.212316 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26c8fc7_3b95_4434_ba23_4e2b5e9cf10f.slice/crio-044cb11a7d8c0072aa5d90769e30e8bce7794a05e2b3b0a4e7976a242ebc4f00 WatchSource:0}: Error finding container 044cb11a7d8c0072aa5d90769e30e8bce7794a05e2b3b0a4e7976a242ebc4f00: Status 404 returned error can't find the container with id 044cb11a7d8c0072aa5d90769e30e8bce7794a05e2b3b0a4e7976a242ebc4f00 Mar 09 16:36:22 crc kubenswrapper[4831]: I0309 16:36:22.015917 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" event={"ID":"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f","Type":"ContainerStarted","Data":"207960b74e8c50b5051ccc3f4f02d55e4a3ff4b989b9b9e018677ef18eddcc90"} Mar 09 16:36:22 crc kubenswrapper[4831]: I0309 16:36:22.016254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" event={"ID":"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f","Type":"ContainerStarted","Data":"044cb11a7d8c0072aa5d90769e30e8bce7794a05e2b3b0a4e7976a242ebc4f00"} Mar 09 16:36:22 crc kubenswrapper[4831]: I0309 16:36:22.031227 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" podStartSLOduration=2.031212935 podStartE2EDuration="2.031212935s" podCreationTimestamp="2026-03-09 16:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:22.029514537 +0000 UTC m=+2309.163196960" 
watchObservedRunningTime="2026-03-09 16:36:22.031212935 +0000 UTC m=+2309.164895358" Mar 09 16:36:23 crc kubenswrapper[4831]: I0309 16:36:23.025381 4831 generic.go:334] "Generic (PLEG): container finished" podID="b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" containerID="207960b74e8c50b5051ccc3f4f02d55e4a3ff4b989b9b9e018677ef18eddcc90" exitCode=0 Mar 09 16:36:23 crc kubenswrapper[4831]: I0309 16:36:23.025516 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" event={"ID":"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f","Type":"ContainerDied","Data":"207960b74e8c50b5051ccc3f4f02d55e4a3ff4b989b9b9e018677ef18eddcc90"} Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.388710 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.430952 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-swiftconf\") pod \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.431333 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-ring-data-devices\") pod \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.431463 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-scripts\") pod \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.431519 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-etc-swift\") pod \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.431551 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-dispersionconf\") pod \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.431571 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbgj6\" (UniqueName: \"kubernetes.io/projected/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-kube-api-access-jbgj6\") pod \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\" (UID: \"b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f\") " Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.431785 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" (UID: "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.432324 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" (UID: "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.436330 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9"] Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.442025 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9"] Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.462980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" (UID: "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.462983 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-kube-api-access-jbgj6" (OuterVolumeSpecName: "kube-api-access-jbgj6") pod "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" (UID: "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f"). InnerVolumeSpecName "kube-api-access-jbgj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.464786 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-scripts" (OuterVolumeSpecName: "scripts") pod "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" (UID: "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.487799 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" (UID: "b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.533420 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.533476 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.533489 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.533901 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbgj6\" (UniqueName: \"kubernetes.io/projected/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-kube-api-access-jbgj6\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.533919 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:24 crc kubenswrapper[4831]: I0309 16:36:24.533931 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.049755 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="044cb11a7d8c0072aa5d90769e30e8bce7794a05e2b3b0a4e7976a242ebc4f00" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.049823 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zn5v9" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.579156 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-777g6"] Mar 09 16:36:25 crc kubenswrapper[4831]: E0309 16:36:25.579440 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" containerName="swift-ring-rebalance" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.579452 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" containerName="swift-ring-rebalance" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.579597 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" containerName="swift-ring-rebalance" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.580037 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.581780 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.581992 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.593940 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-777g6"] Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.629118 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f" path="/var/lib/kubelet/pods/b26c8fc7-3b95-4434-ba23-4e2b5e9cf10f/volumes" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.656185 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-swiftconf\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.656251 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dndp7\" (UniqueName: \"kubernetes.io/projected/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-kube-api-access-dndp7\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.656279 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-etc-swift\") pod 
\"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.656383 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.656518 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-ring-data-devices\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.656681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-dispersionconf\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.757915 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-swiftconf\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.758263 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dndp7\" (UniqueName: 
\"kubernetes.io/projected/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-kube-api-access-dndp7\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.758358 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-etc-swift\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.758506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.758598 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-ring-data-devices\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.758741 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-dispersionconf\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.759164 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.759164 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-ring-data-devices\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.759311 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-etc-swift\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.765230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-dispersionconf\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.771068 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-swiftconf\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.779012 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dndp7\" (UniqueName: 
\"kubernetes.io/projected/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-kube-api-access-dndp7\") pod \"swift-ring-rebalance-debug-777g6\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:25 crc kubenswrapper[4831]: I0309 16:36:25.907589 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:26 crc kubenswrapper[4831]: I0309 16:36:26.343946 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-777g6"] Mar 09 16:36:27 crc kubenswrapper[4831]: I0309 16:36:27.070804 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" event={"ID":"8325e0b9-7e72-4e5e-ba5f-6560b604fc37","Type":"ContainerStarted","Data":"573627a8e53793c7c58e3860171be732e925331696e4bdf50897cb9d902d25ca"} Mar 09 16:36:27 crc kubenswrapper[4831]: I0309 16:36:27.071181 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" event={"ID":"8325e0b9-7e72-4e5e-ba5f-6560b604fc37","Type":"ContainerStarted","Data":"7f9144ab7b8558121434fd3e60631fae2405909a1ed84bec3b76d910b5c512ed"} Mar 09 16:36:27 crc kubenswrapper[4831]: I0309 16:36:27.089421 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" podStartSLOduration=2.089383099 podStartE2EDuration="2.089383099s" podCreationTimestamp="2026-03-09 16:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:27.089361539 +0000 UTC m=+2314.223043972" watchObservedRunningTime="2026-03-09 16:36:27.089383099 +0000 UTC m=+2314.223065512" Mar 09 16:36:28 crc kubenswrapper[4831]: I0309 16:36:28.082831 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="8325e0b9-7e72-4e5e-ba5f-6560b604fc37" containerID="573627a8e53793c7c58e3860171be732e925331696e4bdf50897cb9d902d25ca" exitCode=0 Mar 09 16:36:28 crc kubenswrapper[4831]: I0309 16:36:28.082978 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" event={"ID":"8325e0b9-7e72-4e5e-ba5f-6560b604fc37","Type":"ContainerDied","Data":"573627a8e53793c7c58e3860171be732e925331696e4bdf50897cb9d902d25ca"} Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.332036 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.364442 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-777g6"] Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.367634 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-777g6"] Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.411587 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-dispersionconf\") pod \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.411878 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-swiftconf\") pod \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.411972 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts\") pod \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\" 
(UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.412154 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dndp7\" (UniqueName: \"kubernetes.io/projected/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-kube-api-access-dndp7\") pod \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.412333 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-ring-data-devices\") pod \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.412538 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-etc-swift\") pod \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.412855 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8325e0b9-7e72-4e5e-ba5f-6560b604fc37" (UID: "8325e0b9-7e72-4e5e-ba5f-6560b604fc37"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.413049 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.413176 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8325e0b9-7e72-4e5e-ba5f-6560b604fc37" (UID: "8325e0b9-7e72-4e5e-ba5f-6560b604fc37"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.423571 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-kube-api-access-dndp7" (OuterVolumeSpecName: "kube-api-access-dndp7") pod "8325e0b9-7e72-4e5e-ba5f-6560b604fc37" (UID: "8325e0b9-7e72-4e5e-ba5f-6560b604fc37"). InnerVolumeSpecName "kube-api-access-dndp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.443032 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8325e0b9-7e72-4e5e-ba5f-6560b604fc37" (UID: "8325e0b9-7e72-4e5e-ba5f-6560b604fc37"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:29 crc kubenswrapper[4831]: E0309 16:36:29.443835 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts podName:8325e0b9-7e72-4e5e-ba5f-6560b604fc37 nodeName:}" failed. 
No retries permitted until 2026-03-09 16:36:29.943803885 +0000 UTC m=+2317.077486308 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "scripts" (UniqueName: "kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts") pod "8325e0b9-7e72-4e5e-ba5f-6560b604fc37" (UID: "8325e0b9-7e72-4e5e-ba5f-6560b604fc37") : error deleting /var/lib/kubelet/pods/8325e0b9-7e72-4e5e-ba5f-6560b604fc37/volume-subpaths: remove /var/lib/kubelet/pods/8325e0b9-7e72-4e5e-ba5f-6560b604fc37/volume-subpaths: no such file or directory Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.446668 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8325e0b9-7e72-4e5e-ba5f-6560b604fc37" (UID: "8325e0b9-7e72-4e5e-ba5f-6560b604fc37"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.514469 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.514504 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.514517 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dndp7\" (UniqueName: \"kubernetes.io/projected/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-kube-api-access-dndp7\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:29 crc kubenswrapper[4831]: I0309 16:36:29.514530 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-etc-swift\") on 
node \"crc\" DevicePath \"\"" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.022228 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts\") pod \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\" (UID: \"8325e0b9-7e72-4e5e-ba5f-6560b604fc37\") " Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.022681 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts" (OuterVolumeSpecName: "scripts") pod "8325e0b9-7e72-4e5e-ba5f-6560b604fc37" (UID: "8325e0b9-7e72-4e5e-ba5f-6560b604fc37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.023000 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8325e0b9-7e72-4e5e-ba5f-6560b604fc37-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.104391 4831 scope.go:117] "RemoveContainer" containerID="573627a8e53793c7c58e3860171be732e925331696e4bdf50897cb9d902d25ca" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.104733 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-777g6" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.557360 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lt94h"] Mar 09 16:36:30 crc kubenswrapper[4831]: E0309 16:36:30.557818 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8325e0b9-7e72-4e5e-ba5f-6560b604fc37" containerName="swift-ring-rebalance" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.557835 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8325e0b9-7e72-4e5e-ba5f-6560b604fc37" containerName="swift-ring-rebalance" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.558903 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8325e0b9-7e72-4e5e-ba5f-6560b604fc37" containerName="swift-ring-rebalance" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.559845 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.563387 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.564545 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.565902 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lt94h"] Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.633474 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hc6\" (UniqueName: \"kubernetes.io/projected/2e6e7a1b-ea5f-423c-865a-818eac03253d-kube-api-access-l6hc6\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.633530 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-swiftconf\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.633601 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-scripts\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.633735 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-ring-data-devices\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.633798 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-dispersionconf\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.633830 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e6e7a1b-ea5f-423c-865a-818eac03253d-etc-swift\") pod 
\"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.734990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-ring-data-devices\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.735555 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-dispersionconf\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.735639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e6e7a1b-ea5f-423c-865a-818eac03253d-etc-swift\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.735797 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hc6\" (UniqueName: \"kubernetes.io/projected/2e6e7a1b-ea5f-423c-865a-818eac03253d-kube-api-access-l6hc6\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.735925 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-swiftconf\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.736035 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-scripts\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.736759 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-scripts\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.738557 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-ring-data-devices\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.739183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e6e7a1b-ea5f-423c-865a-818eac03253d-etc-swift\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.742504 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-dispersionconf\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.751822 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-swiftconf\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.764327 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hc6\" (UniqueName: \"kubernetes.io/projected/2e6e7a1b-ea5f-423c-865a-818eac03253d-kube-api-access-l6hc6\") pod \"swift-ring-rebalance-debug-lt94h\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:30 crc kubenswrapper[4831]: I0309 16:36:30.882493 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:31 crc kubenswrapper[4831]: I0309 16:36:31.367331 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lt94h"] Mar 09 16:36:31 crc kubenswrapper[4831]: I0309 16:36:31.630934 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8325e0b9-7e72-4e5e-ba5f-6560b604fc37" path="/var/lib/kubelet/pods/8325e0b9-7e72-4e5e-ba5f-6560b604fc37/volumes" Mar 09 16:36:32 crc kubenswrapper[4831]: I0309 16:36:32.136875 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" event={"ID":"2e6e7a1b-ea5f-423c-865a-818eac03253d","Type":"ContainerStarted","Data":"6be60a4f53b65e5805149cb7281b7cdc37aecb3973b3e604398dfc9ef64783bc"} Mar 09 16:36:32 crc kubenswrapper[4831]: I0309 16:36:32.137284 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" event={"ID":"2e6e7a1b-ea5f-423c-865a-818eac03253d","Type":"ContainerStarted","Data":"278828fbcf42f6bdd8c47fd32bdda95b69aa356e2472083cdc130190833d6e62"} Mar 09 16:36:32 crc kubenswrapper[4831]: I0309 16:36:32.160058 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" podStartSLOduration=2.160040688 podStartE2EDuration="2.160040688s" podCreationTimestamp="2026-03-09 16:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:32.152419092 +0000 UTC m=+2319.286101535" watchObservedRunningTime="2026-03-09 16:36:32.160040688 +0000 UTC m=+2319.293723101" Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.018377 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.019595 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.019965 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.021175 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.021302 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" gracePeriod=600 Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.160433 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" exitCode=0 Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.160502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" 
event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5"} Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.160548 4831 scope.go:117] "RemoveContainer" containerID="258b09aa0807bb4f3c0676f22914bbc545a76966e9075840cca57fa0980ae55e" Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.163004 4831 generic.go:334] "Generic (PLEG): container finished" podID="2e6e7a1b-ea5f-423c-865a-818eac03253d" containerID="6be60a4f53b65e5805149cb7281b7cdc37aecb3973b3e604398dfc9ef64783bc" exitCode=0 Mar 09 16:36:33 crc kubenswrapper[4831]: I0309 16:36:33.163035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" event={"ID":"2e6e7a1b-ea5f-423c-865a-818eac03253d","Type":"ContainerDied","Data":"6be60a4f53b65e5805149cb7281b7cdc37aecb3973b3e604398dfc9ef64783bc"} Mar 09 16:36:33 crc kubenswrapper[4831]: E0309 16:36:33.228806 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.178599 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:36:34 crc kubenswrapper[4831]: E0309 16:36:34.179671 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.482258 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.519300 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lt94h"] Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.524516 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lt94h"] Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.598183 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-scripts\") pod \"2e6e7a1b-ea5f-423c-865a-818eac03253d\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.598233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6hc6\" (UniqueName: \"kubernetes.io/projected/2e6e7a1b-ea5f-423c-865a-818eac03253d-kube-api-access-l6hc6\") pod \"2e6e7a1b-ea5f-423c-865a-818eac03253d\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.598323 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e6e7a1b-ea5f-423c-865a-818eac03253d-etc-swift\") pod \"2e6e7a1b-ea5f-423c-865a-818eac03253d\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.598423 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-ring-data-devices\") pod 
\"2e6e7a1b-ea5f-423c-865a-818eac03253d\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.598482 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-swiftconf\") pod \"2e6e7a1b-ea5f-423c-865a-818eac03253d\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.598530 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-dispersionconf\") pod \"2e6e7a1b-ea5f-423c-865a-818eac03253d\" (UID: \"2e6e7a1b-ea5f-423c-865a-818eac03253d\") " Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.599254 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2e6e7a1b-ea5f-423c-865a-818eac03253d" (UID: "2e6e7a1b-ea5f-423c-865a-818eac03253d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.599741 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6e7a1b-ea5f-423c-865a-818eac03253d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2e6e7a1b-ea5f-423c-865a-818eac03253d" (UID: "2e6e7a1b-ea5f-423c-865a-818eac03253d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.612631 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6e7a1b-ea5f-423c-865a-818eac03253d-kube-api-access-l6hc6" (OuterVolumeSpecName: "kube-api-access-l6hc6") pod "2e6e7a1b-ea5f-423c-865a-818eac03253d" (UID: "2e6e7a1b-ea5f-423c-865a-818eac03253d"). InnerVolumeSpecName "kube-api-access-l6hc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.617674 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-scripts" (OuterVolumeSpecName: "scripts") pod "2e6e7a1b-ea5f-423c-865a-818eac03253d" (UID: "2e6e7a1b-ea5f-423c-865a-818eac03253d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.624956 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2e6e7a1b-ea5f-423c-865a-818eac03253d" (UID: "2e6e7a1b-ea5f-423c-865a-818eac03253d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.625836 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2e6e7a1b-ea5f-423c-865a-818eac03253d" (UID: "2e6e7a1b-ea5f-423c-865a-818eac03253d"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.699588 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.699621 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e6e7a1b-ea5f-423c-865a-818eac03253d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.699634 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.699646 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6hc6\" (UniqueName: \"kubernetes.io/projected/2e6e7a1b-ea5f-423c-865a-818eac03253d-kube-api-access-l6hc6\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.699658 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e6e7a1b-ea5f-423c-865a-818eac03253d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:34 crc kubenswrapper[4831]: I0309 16:36:34.699697 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e6e7a1b-ea5f-423c-865a-818eac03253d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.191602 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278828fbcf42f6bdd8c47fd32bdda95b69aa356e2472083cdc130190833d6e62" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.191695 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lt94h" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.625971 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6e7a1b-ea5f-423c-865a-818eac03253d" path="/var/lib/kubelet/pods/2e6e7a1b-ea5f-423c-865a-818eac03253d/volumes" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.669217 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz"] Mar 09 16:36:35 crc kubenswrapper[4831]: E0309 16:36:35.669599 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6e7a1b-ea5f-423c-865a-818eac03253d" containerName="swift-ring-rebalance" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.669618 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6e7a1b-ea5f-423c-865a-818eac03253d" containerName="swift-ring-rebalance" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.669747 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6e7a1b-ea5f-423c-865a-818eac03253d" containerName="swift-ring-rebalance" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.670869 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.672961 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.672977 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.680691 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz"] Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.714610 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-scripts\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.714685 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz4kp\" (UniqueName: \"kubernetes.io/projected/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-kube-api-access-sz4kp\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.715153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-ring-data-devices\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.715415 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-dispersionconf\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.715513 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-swiftconf\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.715754 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-etc-swift\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.817893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-dispersionconf\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.818042 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-swiftconf\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 
16:36:35.818095 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-etc-swift\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.818182 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-scripts\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.818226 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz4kp\" (UniqueName: \"kubernetes.io/projected/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-kube-api-access-sz4kp\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.818281 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-ring-data-devices\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.819115 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-etc-swift\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: 
I0309 16:36:35.819614 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-scripts\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.819854 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-ring-data-devices\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.824968 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-dispersionconf\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.825009 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-swiftconf\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.835207 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz4kp\" (UniqueName: \"kubernetes.io/projected/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-kube-api-access-sz4kp\") pod \"swift-ring-rebalance-debug-cdlzz\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:35 crc kubenswrapper[4831]: I0309 16:36:35.989971 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:36 crc kubenswrapper[4831]: I0309 16:36:36.434889 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz"] Mar 09 16:36:37 crc kubenswrapper[4831]: I0309 16:36:37.225638 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" event={"ID":"cb8c29aa-d3f9-4ef3-b502-dd2948375dee","Type":"ContainerStarted","Data":"df6d241fdb1cc30609100896af6197519ae6960704f499b1cd64dfe3c0f5d254"} Mar 09 16:36:37 crc kubenswrapper[4831]: I0309 16:36:37.225969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" event={"ID":"cb8c29aa-d3f9-4ef3-b502-dd2948375dee","Type":"ContainerStarted","Data":"c531f8a193bd00df62019e767dd75b652a2d92eb094f0c56d5af29b687379b35"} Mar 09 16:36:37 crc kubenswrapper[4831]: I0309 16:36:37.247014 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" podStartSLOduration=2.246987398 podStartE2EDuration="2.246987398s" podCreationTimestamp="2026-03-09 16:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:37.246782142 +0000 UTC m=+2324.380464565" watchObservedRunningTime="2026-03-09 16:36:37.246987398 +0000 UTC m=+2324.380669861" Mar 09 16:36:38 crc kubenswrapper[4831]: I0309 16:36:38.236695 4831 generic.go:334] "Generic (PLEG): container finished" podID="cb8c29aa-d3f9-4ef3-b502-dd2948375dee" containerID="df6d241fdb1cc30609100896af6197519ae6960704f499b1cd64dfe3c0f5d254" exitCode=0 Mar 09 16:36:38 crc kubenswrapper[4831]: I0309 16:36:38.236757 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" 
event={"ID":"cb8c29aa-d3f9-4ef3-b502-dd2948375dee","Type":"ContainerDied","Data":"df6d241fdb1cc30609100896af6197519ae6960704f499b1cd64dfe3c0f5d254"} Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.886164 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.919687 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz"] Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.925855 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz"] Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.990674 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-scripts\") pod \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.990761 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-etc-swift\") pod \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.990818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz4kp\" (UniqueName: \"kubernetes.io/projected/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-kube-api-access-sz4kp\") pod \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.990842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-ring-data-devices\") pod \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.990881 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-dispersionconf\") pod \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.990969 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-swiftconf\") pod \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\" (UID: \"cb8c29aa-d3f9-4ef3-b502-dd2948375dee\") " Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.991482 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cb8c29aa-d3f9-4ef3-b502-dd2948375dee" (UID: "cb8c29aa-d3f9-4ef3-b502-dd2948375dee"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.992196 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cb8c29aa-d3f9-4ef3-b502-dd2948375dee" (UID: "cb8c29aa-d3f9-4ef3-b502-dd2948375dee"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:39 crc kubenswrapper[4831]: I0309 16:36:39.996569 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-kube-api-access-sz4kp" (OuterVolumeSpecName: "kube-api-access-sz4kp") pod "cb8c29aa-d3f9-4ef3-b502-dd2948375dee" (UID: "cb8c29aa-d3f9-4ef3-b502-dd2948375dee"). InnerVolumeSpecName "kube-api-access-sz4kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.012514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-scripts" (OuterVolumeSpecName: "scripts") pod "cb8c29aa-d3f9-4ef3-b502-dd2948375dee" (UID: "cb8c29aa-d3f9-4ef3-b502-dd2948375dee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.016647 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cb8c29aa-d3f9-4ef3-b502-dd2948375dee" (UID: "cb8c29aa-d3f9-4ef3-b502-dd2948375dee"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.023857 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cb8c29aa-d3f9-4ef3-b502-dd2948375dee" (UID: "cb8c29aa-d3f9-4ef3-b502-dd2948375dee"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.093318 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.093374 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.093423 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.093442 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz4kp\" (UniqueName: \"kubernetes.io/projected/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-kube-api-access-sz4kp\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.093462 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.093480 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb8c29aa-d3f9-4ef3-b502-dd2948375dee-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.260064 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c531f8a193bd00df62019e767dd75b652a2d92eb094f0c56d5af29b687379b35" Mar 09 16:36:40 crc kubenswrapper[4831]: I0309 16:36:40.260092 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cdlzz" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.068197 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v"] Mar 09 16:36:41 crc kubenswrapper[4831]: E0309 16:36:41.068850 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8c29aa-d3f9-4ef3-b502-dd2948375dee" containerName="swift-ring-rebalance" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.068872 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8c29aa-d3f9-4ef3-b502-dd2948375dee" containerName="swift-ring-rebalance" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.069133 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb8c29aa-d3f9-4ef3-b502-dd2948375dee" containerName="swift-ring-rebalance" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.070334 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.072669 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.075204 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.083251 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v"] Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.207964 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-swiftconf\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.208017 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f642a93-a9b7-492b-b363-598bf435e5cb-etc-swift\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.208781 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-scripts\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.208966 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhswc\" (UniqueName: \"kubernetes.io/projected/0f642a93-a9b7-492b-b363-598bf435e5cb-kube-api-access-vhswc\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.209163 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-dispersionconf\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.209206 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-ring-data-devices\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.310534 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-scripts\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.310714 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhswc\" (UniqueName: \"kubernetes.io/projected/0f642a93-a9b7-492b-b363-598bf435e5cb-kube-api-access-vhswc\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.310845 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-ring-data-devices\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.310900 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-dispersionconf\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.310977 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-swiftconf\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.311019 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f642a93-a9b7-492b-b363-598bf435e5cb-etc-swift\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.311458 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-scripts\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.311574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-ring-data-devices\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.312740 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f642a93-a9b7-492b-b363-598bf435e5cb-etc-swift\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.316177 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-swiftconf\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.317963 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-dispersionconf\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.327184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhswc\" (UniqueName: \"kubernetes.io/projected/0f642a93-a9b7-492b-b363-598bf435e5cb-kube-api-access-vhswc\") pod \"swift-ring-rebalance-debug-zfx2v\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.407804 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.632481 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb8c29aa-d3f9-4ef3-b502-dd2948375dee" path="/var/lib/kubelet/pods/cb8c29aa-d3f9-4ef3-b502-dd2948375dee/volumes" Mar 09 16:36:41 crc kubenswrapper[4831]: I0309 16:36:41.927896 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v"] Mar 09 16:36:42 crc kubenswrapper[4831]: I0309 16:36:42.281011 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" event={"ID":"0f642a93-a9b7-492b-b363-598bf435e5cb","Type":"ContainerStarted","Data":"09f1853d56c4d9bb0e109dd13097d4c4c8d2ef0d5f31a20039457d19447577fb"} Mar 09 16:36:42 crc kubenswrapper[4831]: I0309 16:36:42.281059 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" event={"ID":"0f642a93-a9b7-492b-b363-598bf435e5cb","Type":"ContainerStarted","Data":"8fa31e833cda8ff730cc282462986536832578fa1400283f54ec420650416916"} Mar 09 16:36:42 crc kubenswrapper[4831]: I0309 16:36:42.300079 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" podStartSLOduration=1.300032687 podStartE2EDuration="1.300032687s" podCreationTimestamp="2026-03-09 16:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:42.297950188 +0000 UTC m=+2329.431632631" watchObservedRunningTime="2026-03-09 16:36:42.300032687 +0000 UTC m=+2329.433715120" Mar 09 16:36:44 crc kubenswrapper[4831]: I0309 16:36:44.297602 4831 generic.go:334] "Generic (PLEG): container finished" podID="0f642a93-a9b7-492b-b363-598bf435e5cb" containerID="09f1853d56c4d9bb0e109dd13097d4c4c8d2ef0d5f31a20039457d19447577fb" exitCode=0 
Mar 09 16:36:44 crc kubenswrapper[4831]: I0309 16:36:44.297680 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" event={"ID":"0f642a93-a9b7-492b-b363-598bf435e5cb","Type":"ContainerDied","Data":"09f1853d56c4d9bb0e109dd13097d4c4c8d2ef0d5f31a20039457d19447577fb"} Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.632721 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.666715 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v"] Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.672906 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v"] Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.784819 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f642a93-a9b7-492b-b363-598bf435e5cb-etc-swift\") pod \"0f642a93-a9b7-492b-b363-598bf435e5cb\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.784972 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-dispersionconf\") pod \"0f642a93-a9b7-492b-b363-598bf435e5cb\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.785007 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhswc\" (UniqueName: \"kubernetes.io/projected/0f642a93-a9b7-492b-b363-598bf435e5cb-kube-api-access-vhswc\") pod \"0f642a93-a9b7-492b-b363-598bf435e5cb\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 
16:36:45.785071 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-scripts\") pod \"0f642a93-a9b7-492b-b363-598bf435e5cb\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.785107 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-ring-data-devices\") pod \"0f642a93-a9b7-492b-b363-598bf435e5cb\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.785134 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-swiftconf\") pod \"0f642a93-a9b7-492b-b363-598bf435e5cb\" (UID: \"0f642a93-a9b7-492b-b363-598bf435e5cb\") " Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.785886 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0f642a93-a9b7-492b-b363-598bf435e5cb" (UID: "0f642a93-a9b7-492b-b363-598bf435e5cb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.785918 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f642a93-a9b7-492b-b363-598bf435e5cb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0f642a93-a9b7-492b-b363-598bf435e5cb" (UID: "0f642a93-a9b7-492b-b363-598bf435e5cb"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.789895 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f642a93-a9b7-492b-b363-598bf435e5cb-kube-api-access-vhswc" (OuterVolumeSpecName: "kube-api-access-vhswc") pod "0f642a93-a9b7-492b-b363-598bf435e5cb" (UID: "0f642a93-a9b7-492b-b363-598bf435e5cb"). InnerVolumeSpecName "kube-api-access-vhswc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.807222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-scripts" (OuterVolumeSpecName: "scripts") pod "0f642a93-a9b7-492b-b363-598bf435e5cb" (UID: "0f642a93-a9b7-492b-b363-598bf435e5cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.809296 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0f642a93-a9b7-492b-b363-598bf435e5cb" (UID: "0f642a93-a9b7-492b-b363-598bf435e5cb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.809585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0f642a93-a9b7-492b-b363-598bf435e5cb" (UID: "0f642a93-a9b7-492b-b363-598bf435e5cb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.886982 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f642a93-a9b7-492b-b363-598bf435e5cb-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.887017 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.887048 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhswc\" (UniqueName: \"kubernetes.io/projected/0f642a93-a9b7-492b-b363-598bf435e5cb-kube-api-access-vhswc\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.887060 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.887069 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f642a93-a9b7-492b-b363-598bf435e5cb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:45 crc kubenswrapper[4831]: I0309 16:36:45.887077 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f642a93-a9b7-492b-b363-598bf435e5cb-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.318447 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa31e833cda8ff730cc282462986536832578fa1400283f54ec420650416916" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.318553 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zfx2v" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.618274 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:36:46 crc kubenswrapper[4831]: E0309 16:36:46.618588 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.802192 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz"] Mar 09 16:36:46 crc kubenswrapper[4831]: E0309 16:36:46.802482 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f642a93-a9b7-492b-b363-598bf435e5cb" containerName="swift-ring-rebalance" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.802497 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f642a93-a9b7-492b-b363-598bf435e5cb" containerName="swift-ring-rebalance" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.802657 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f642a93-a9b7-492b-b363-598bf435e5cb" containerName="swift-ring-rebalance" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.803151 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.807508 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.807632 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.812992 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz"] Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.901418 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-swiftconf\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.901476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bf0265a-ec84-4e5a-879c-a7b2d792be85-etc-swift\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.901521 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-dispersionconf\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.901542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-x4m4c\" (UniqueName: \"kubernetes.io/projected/8bf0265a-ec84-4e5a-879c-a7b2d792be85-kube-api-access-x4m4c\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.901625 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-scripts\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:46 crc kubenswrapper[4831]: I0309 16:36:46.901672 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-ring-data-devices\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.003449 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-ring-data-devices\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.003533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-swiftconf\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.003568 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bf0265a-ec84-4e5a-879c-a7b2d792be85-etc-swift\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.003621 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-dispersionconf\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.003644 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4m4c\" (UniqueName: \"kubernetes.io/projected/8bf0265a-ec84-4e5a-879c-a7b2d792be85-kube-api-access-x4m4c\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.003701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-scripts\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.004363 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bf0265a-ec84-4e5a-879c-a7b2d792be85-etc-swift\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.004720 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-ring-data-devices\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.004862 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-scripts\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.008213 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-dispersionconf\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.008616 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-swiftconf\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.019280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4m4c\" (UniqueName: \"kubernetes.io/projected/8bf0265a-ec84-4e5a-879c-a7b2d792be85-kube-api-access-x4m4c\") pod \"swift-ring-rebalance-debug-9mxwz\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.130710 4831 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.400381 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz"] Mar 09 16:36:47 crc kubenswrapper[4831]: I0309 16:36:47.626667 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f642a93-a9b7-492b-b363-598bf435e5cb" path="/var/lib/kubelet/pods/0f642a93-a9b7-492b-b363-598bf435e5cb/volumes" Mar 09 16:36:48 crc kubenswrapper[4831]: I0309 16:36:48.342171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" event={"ID":"8bf0265a-ec84-4e5a-879c-a7b2d792be85","Type":"ContainerStarted","Data":"ac1c0f5ccf6f36c3c274f744e8b1b005b318a35f212e51717d7dbe9a43af25be"} Mar 09 16:36:48 crc kubenswrapper[4831]: I0309 16:36:48.342217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" event={"ID":"8bf0265a-ec84-4e5a-879c-a7b2d792be85","Type":"ContainerStarted","Data":"bc0e898326266bb7c355fc4d8ef6cadc404b16272c0d7bd398ca325e9e9d20a1"} Mar 09 16:36:48 crc kubenswrapper[4831]: I0309 16:36:48.361803 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" podStartSLOduration=2.361786018 podStartE2EDuration="2.361786018s" podCreationTimestamp="2026-03-09 16:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:48.355226272 +0000 UTC m=+2335.488908715" watchObservedRunningTime="2026-03-09 16:36:48.361786018 +0000 UTC m=+2335.495468441" Mar 09 16:36:49 crc kubenswrapper[4831]: I0309 16:36:49.352299 4831 generic.go:334] "Generic (PLEG): container finished" podID="8bf0265a-ec84-4e5a-879c-a7b2d792be85" 
containerID="ac1c0f5ccf6f36c3c274f744e8b1b005b318a35f212e51717d7dbe9a43af25be" exitCode=0 Mar 09 16:36:49 crc kubenswrapper[4831]: I0309 16:36:49.352661 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" event={"ID":"8bf0265a-ec84-4e5a-879c-a7b2d792be85","Type":"ContainerDied","Data":"ac1c0f5ccf6f36c3c274f744e8b1b005b318a35f212e51717d7dbe9a43af25be"} Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.654734 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.688530 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz"] Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.691325 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz"] Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.765096 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bf0265a-ec84-4e5a-879c-a7b2d792be85-etc-swift\") pod \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.765157 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-swiftconf\") pod \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.765192 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4m4c\" (UniqueName: \"kubernetes.io/projected/8bf0265a-ec84-4e5a-879c-a7b2d792be85-kube-api-access-x4m4c\") pod \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\" (UID: 
\"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.765269 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-scripts\") pod \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.765300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-dispersionconf\") pod \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.765359 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-ring-data-devices\") pod \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\" (UID: \"8bf0265a-ec84-4e5a-879c-a7b2d792be85\") " Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.765893 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bf0265a-ec84-4e5a-879c-a7b2d792be85-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8bf0265a-ec84-4e5a-879c-a7b2d792be85" (UID: "8bf0265a-ec84-4e5a-879c-a7b2d792be85"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.766707 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8bf0265a-ec84-4e5a-879c-a7b2d792be85" (UID: "8bf0265a-ec84-4e5a-879c-a7b2d792be85"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.774984 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf0265a-ec84-4e5a-879c-a7b2d792be85-kube-api-access-x4m4c" (OuterVolumeSpecName: "kube-api-access-x4m4c") pod "8bf0265a-ec84-4e5a-879c-a7b2d792be85" (UID: "8bf0265a-ec84-4e5a-879c-a7b2d792be85"). InnerVolumeSpecName "kube-api-access-x4m4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.791454 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-scripts" (OuterVolumeSpecName: "scripts") pod "8bf0265a-ec84-4e5a-879c-a7b2d792be85" (UID: "8bf0265a-ec84-4e5a-879c-a7b2d792be85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.793755 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8bf0265a-ec84-4e5a-879c-a7b2d792be85" (UID: "8bf0265a-ec84-4e5a-879c-a7b2d792be85"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.800258 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8bf0265a-ec84-4e5a-879c-a7b2d792be85" (UID: "8bf0265a-ec84-4e5a-879c-a7b2d792be85"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.867681 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.867737 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bf0265a-ec84-4e5a-879c-a7b2d792be85-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.867750 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.867761 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4m4c\" (UniqueName: \"kubernetes.io/projected/8bf0265a-ec84-4e5a-879c-a7b2d792be85-kube-api-access-x4m4c\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.867778 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bf0265a-ec84-4e5a-879c-a7b2d792be85-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:50 crc kubenswrapper[4831]: I0309 16:36:50.867792 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bf0265a-ec84-4e5a-879c-a7b2d792be85-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.375055 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0e898326266bb7c355fc4d8ef6cadc404b16272c0d7bd398ca325e9e9d20a1" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.375110 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9mxwz" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.636632 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf0265a-ec84-4e5a-879c-a7b2d792be85" path="/var/lib/kubelet/pods/8bf0265a-ec84-4e5a-879c-a7b2d792be85/volumes" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.845143 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rg94h"] Mar 09 16:36:51 crc kubenswrapper[4831]: E0309 16:36:51.845751 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf0265a-ec84-4e5a-879c-a7b2d792be85" containerName="swift-ring-rebalance" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.845769 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf0265a-ec84-4e5a-879c-a7b2d792be85" containerName="swift-ring-rebalance" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.845957 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf0265a-ec84-4e5a-879c-a7b2d792be85" containerName="swift-ring-rebalance" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.846685 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.850082 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.850634 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.870849 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rg94h"] Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.983774 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/803d6585-a046-4dba-bd78-9b7d7e5c87be-etc-swift\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.984219 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7bdm\" (UniqueName: \"kubernetes.io/projected/803d6585-a046-4dba-bd78-9b7d7e5c87be-kube-api-access-b7bdm\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.984258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-swiftconf\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.984298 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-ring-data-devices\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.984374 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-scripts\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:51 crc kubenswrapper[4831]: I0309 16:36:51.984523 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-dispersionconf\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.086183 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-ring-data-devices\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.087091 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-ring-data-devices\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 
crc kubenswrapper[4831]: I0309 16:36:52.087210 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-scripts\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.087461 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-dispersionconf\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.087606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/803d6585-a046-4dba-bd78-9b7d7e5c87be-etc-swift\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.087643 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-scripts\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.087907 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bdm\" (UniqueName: \"kubernetes.io/projected/803d6585-a046-4dba-bd78-9b7d7e5c87be-kube-api-access-b7bdm\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc 
kubenswrapper[4831]: I0309 16:36:52.087995 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-swiftconf\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.088201 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/803d6585-a046-4dba-bd78-9b7d7e5c87be-etc-swift\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.093059 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-swiftconf\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.093068 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-dispersionconf\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.111167 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7bdm\" (UniqueName: \"kubernetes.io/projected/803d6585-a046-4dba-bd78-9b7d7e5c87be-kube-api-access-b7bdm\") pod \"swift-ring-rebalance-debug-rg94h\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 
16:36:52.178967 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:52 crc kubenswrapper[4831]: I0309 16:36:52.660327 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rg94h"] Mar 09 16:36:53 crc kubenswrapper[4831]: I0309 16:36:53.392166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" event={"ID":"803d6585-a046-4dba-bd78-9b7d7e5c87be","Type":"ContainerStarted","Data":"040fd1818ff8b62ebd3f5eca84edbfcca285d8ec196b648f17c676722773f7b8"} Mar 09 16:36:53 crc kubenswrapper[4831]: I0309 16:36:53.392215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" event={"ID":"803d6585-a046-4dba-bd78-9b7d7e5c87be","Type":"ContainerStarted","Data":"57c486e5c52d12e83d2d9e63e8fd3ceeaefd41528a378dbb049d51048b2d5fca"} Mar 09 16:36:53 crc kubenswrapper[4831]: I0309 16:36:53.411220 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" podStartSLOduration=2.411199924 podStartE2EDuration="2.411199924s" podCreationTimestamp="2026-03-09 16:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:53.410180955 +0000 UTC m=+2340.543863378" watchObservedRunningTime="2026-03-09 16:36:53.411199924 +0000 UTC m=+2340.544882347" Mar 09 16:36:54 crc kubenswrapper[4831]: I0309 16:36:54.404791 4831 generic.go:334] "Generic (PLEG): container finished" podID="803d6585-a046-4dba-bd78-9b7d7e5c87be" containerID="040fd1818ff8b62ebd3f5eca84edbfcca285d8ec196b648f17c676722773f7b8" exitCode=0 Mar 09 16:36:54 crc kubenswrapper[4831]: I0309 16:36:54.404936 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" 
event={"ID":"803d6585-a046-4dba-bd78-9b7d7e5c87be","Type":"ContainerDied","Data":"040fd1818ff8b62ebd3f5eca84edbfcca285d8ec196b648f17c676722773f7b8"} Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.707952 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.757261 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rg94h"] Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.765200 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rg94h"] Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.858799 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/803d6585-a046-4dba-bd78-9b7d7e5c87be-etc-swift\") pod \"803d6585-a046-4dba-bd78-9b7d7e5c87be\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.858894 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-scripts\") pod \"803d6585-a046-4dba-bd78-9b7d7e5c87be\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.859056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-ring-data-devices\") pod \"803d6585-a046-4dba-bd78-9b7d7e5c87be\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.859142 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-swiftconf\") pod 
\"803d6585-a046-4dba-bd78-9b7d7e5c87be\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.859202 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7bdm\" (UniqueName: \"kubernetes.io/projected/803d6585-a046-4dba-bd78-9b7d7e5c87be-kube-api-access-b7bdm\") pod \"803d6585-a046-4dba-bd78-9b7d7e5c87be\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.859308 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-dispersionconf\") pod \"803d6585-a046-4dba-bd78-9b7d7e5c87be\" (UID: \"803d6585-a046-4dba-bd78-9b7d7e5c87be\") " Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.859977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803d6585-a046-4dba-bd78-9b7d7e5c87be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "803d6585-a046-4dba-bd78-9b7d7e5c87be" (UID: "803d6585-a046-4dba-bd78-9b7d7e5c87be"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.860774 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "803d6585-a046-4dba-bd78-9b7d7e5c87be" (UID: "803d6585-a046-4dba-bd78-9b7d7e5c87be"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.867518 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803d6585-a046-4dba-bd78-9b7d7e5c87be-kube-api-access-b7bdm" (OuterVolumeSpecName: "kube-api-access-b7bdm") pod "803d6585-a046-4dba-bd78-9b7d7e5c87be" (UID: "803d6585-a046-4dba-bd78-9b7d7e5c87be"). InnerVolumeSpecName "kube-api-access-b7bdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.885374 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-scripts" (OuterVolumeSpecName: "scripts") pod "803d6585-a046-4dba-bd78-9b7d7e5c87be" (UID: "803d6585-a046-4dba-bd78-9b7d7e5c87be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.887108 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "803d6585-a046-4dba-bd78-9b7d7e5c87be" (UID: "803d6585-a046-4dba-bd78-9b7d7e5c87be"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.889044 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "803d6585-a046-4dba-bd78-9b7d7e5c87be" (UID: "803d6585-a046-4dba-bd78-9b7d7e5c87be"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.961114 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/803d6585-a046-4dba-bd78-9b7d7e5c87be-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.961150 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.961160 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/803d6585-a046-4dba-bd78-9b7d7e5c87be-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.961171 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.961180 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7bdm\" (UniqueName: \"kubernetes.io/projected/803d6585-a046-4dba-bd78-9b7d7e5c87be-kube-api-access-b7bdm\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:55 crc kubenswrapper[4831]: I0309 16:36:55.961189 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/803d6585-a046-4dba-bd78-9b7d7e5c87be-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.428826 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57c486e5c52d12e83d2d9e63e8fd3ceeaefd41528a378dbb049d51048b2d5fca" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.428935 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rg94h" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.879885 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv"] Mar 09 16:36:56 crc kubenswrapper[4831]: E0309 16:36:56.880540 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803d6585-a046-4dba-bd78-9b7d7e5c87be" containerName="swift-ring-rebalance" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.880557 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="803d6585-a046-4dba-bd78-9b7d7e5c87be" containerName="swift-ring-rebalance" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.880680 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="803d6585-a046-4dba-bd78-9b7d7e5c87be" containerName="swift-ring-rebalance" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.881148 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.883530 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.883656 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:36:56 crc kubenswrapper[4831]: I0309 16:36:56.886847 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv"] Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.078303 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-dispersionconf\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.078357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37b99364-3b2a-4004-96ab-eb074a3c642c-etc-swift\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.078419 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-swiftconf\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.078441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jw54\" (UniqueName: \"kubernetes.io/projected/37b99364-3b2a-4004-96ab-eb074a3c642c-kube-api-access-2jw54\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.078470 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-ring-data-devices\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.078516 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-scripts\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.179467 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-ring-data-devices\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.179550 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-scripts\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.179603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-dispersionconf\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.179627 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37b99364-3b2a-4004-96ab-eb074a3c642c-etc-swift\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.179652 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-swiftconf\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.179671 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jw54\" (UniqueName: \"kubernetes.io/projected/37b99364-3b2a-4004-96ab-eb074a3c642c-kube-api-access-2jw54\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.180511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-ring-data-devices\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.180573 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37b99364-3b2a-4004-96ab-eb074a3c642c-etc-swift\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.181229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-scripts\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.183574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-dispersionconf\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.186681 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-swiftconf\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.200755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jw54\" (UniqueName: \"kubernetes.io/projected/37b99364-3b2a-4004-96ab-eb074a3c642c-kube-api-access-2jw54\") pod \"swift-ring-rebalance-debug-q7jwv\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.497342 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.618010 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:36:57 crc kubenswrapper[4831]: E0309 16:36:57.618715 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.627157 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803d6585-a046-4dba-bd78-9b7d7e5c87be" path="/var/lib/kubelet/pods/803d6585-a046-4dba-bd78-9b7d7e5c87be/volumes" Mar 09 16:36:57 crc kubenswrapper[4831]: I0309 16:36:57.974064 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv"] Mar 09 16:36:57 crc kubenswrapper[4831]: W0309 16:36:57.979585 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b99364_3b2a_4004_96ab_eb074a3c642c.slice/crio-40c579afe82430c7c1a72b9b8329871941e04497a596d6946b4bdcc3022c201b WatchSource:0}: Error finding container 40c579afe82430c7c1a72b9b8329871941e04497a596d6946b4bdcc3022c201b: Status 404 returned error can't find the container with id 40c579afe82430c7c1a72b9b8329871941e04497a596d6946b4bdcc3022c201b Mar 09 16:36:58 crc kubenswrapper[4831]: I0309 16:36:58.447953 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" 
event={"ID":"37b99364-3b2a-4004-96ab-eb074a3c642c","Type":"ContainerStarted","Data":"5de2a78cd4a02d6f6f16c12e808e08684ea8582034c5b91977cf024017b45d62"} Mar 09 16:36:58 crc kubenswrapper[4831]: I0309 16:36:58.448273 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" event={"ID":"37b99364-3b2a-4004-96ab-eb074a3c642c","Type":"ContainerStarted","Data":"40c579afe82430c7c1a72b9b8329871941e04497a596d6946b4bdcc3022c201b"} Mar 09 16:36:58 crc kubenswrapper[4831]: I0309 16:36:58.467516 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" podStartSLOduration=2.467500196 podStartE2EDuration="2.467500196s" podCreationTimestamp="2026-03-09 16:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:36:58.465569331 +0000 UTC m=+2345.599251774" watchObservedRunningTime="2026-03-09 16:36:58.467500196 +0000 UTC m=+2345.601182619" Mar 09 16:36:59 crc kubenswrapper[4831]: I0309 16:36:59.456850 4831 generic.go:334] "Generic (PLEG): container finished" podID="37b99364-3b2a-4004-96ab-eb074a3c642c" containerID="5de2a78cd4a02d6f6f16c12e808e08684ea8582034c5b91977cf024017b45d62" exitCode=0 Mar 09 16:36:59 crc kubenswrapper[4831]: I0309 16:36:59.456961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" event={"ID":"37b99364-3b2a-4004-96ab-eb074a3c642c","Type":"ContainerDied","Data":"5de2a78cd4a02d6f6f16c12e808e08684ea8582034c5b91977cf024017b45d62"} Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.746314 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.786851 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv"] Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.792678 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv"] Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.831556 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-ring-data-devices\") pod \"37b99364-3b2a-4004-96ab-eb074a3c642c\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.831630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jw54\" (UniqueName: \"kubernetes.io/projected/37b99364-3b2a-4004-96ab-eb074a3c642c-kube-api-access-2jw54\") pod \"37b99364-3b2a-4004-96ab-eb074a3c642c\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.831654 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-swiftconf\") pod \"37b99364-3b2a-4004-96ab-eb074a3c642c\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.831680 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-dispersionconf\") pod \"37b99364-3b2a-4004-96ab-eb074a3c642c\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.831720 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37b99364-3b2a-4004-96ab-eb074a3c642c-etc-swift\") pod \"37b99364-3b2a-4004-96ab-eb074a3c642c\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.831756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-scripts\") pod \"37b99364-3b2a-4004-96ab-eb074a3c642c\" (UID: \"37b99364-3b2a-4004-96ab-eb074a3c642c\") " Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.832077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "37b99364-3b2a-4004-96ab-eb074a3c642c" (UID: "37b99364-3b2a-4004-96ab-eb074a3c642c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.832948 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b99364-3b2a-4004-96ab-eb074a3c642c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "37b99364-3b2a-4004-96ab-eb074a3c642c" (UID: "37b99364-3b2a-4004-96ab-eb074a3c642c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.836642 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b99364-3b2a-4004-96ab-eb074a3c642c-kube-api-access-2jw54" (OuterVolumeSpecName: "kube-api-access-2jw54") pod "37b99364-3b2a-4004-96ab-eb074a3c642c" (UID: "37b99364-3b2a-4004-96ab-eb074a3c642c"). InnerVolumeSpecName "kube-api-access-2jw54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.851506 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-scripts" (OuterVolumeSpecName: "scripts") pod "37b99364-3b2a-4004-96ab-eb074a3c642c" (UID: "37b99364-3b2a-4004-96ab-eb074a3c642c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.858861 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "37b99364-3b2a-4004-96ab-eb074a3c642c" (UID: "37b99364-3b2a-4004-96ab-eb074a3c642c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.863627 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "37b99364-3b2a-4004-96ab-eb074a3c642c" (UID: "37b99364-3b2a-4004-96ab-eb074a3c642c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.933037 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.933078 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jw54\" (UniqueName: \"kubernetes.io/projected/37b99364-3b2a-4004-96ab-eb074a3c642c-kube-api-access-2jw54\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.933094 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.933105 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37b99364-3b2a-4004-96ab-eb074a3c642c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.933114 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37b99364-3b2a-4004-96ab-eb074a3c642c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:00 crc kubenswrapper[4831]: I0309 16:37:00.933124 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37b99364-3b2a-4004-96ab-eb074a3c642c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.476448 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40c579afe82430c7c1a72b9b8329871941e04497a596d6946b4bdcc3022c201b" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.476530 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q7jwv" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.627115 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b99364-3b2a-4004-96ab-eb074a3c642c" path="/var/lib/kubelet/pods/37b99364-3b2a-4004-96ab-eb074a3c642c/volumes" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.935272 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f"] Mar 09 16:37:01 crc kubenswrapper[4831]: E0309 16:37:01.937002 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b99364-3b2a-4004-96ab-eb074a3c642c" containerName="swift-ring-rebalance" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.937045 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b99364-3b2a-4004-96ab-eb074a3c642c" containerName="swift-ring-rebalance" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.937267 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b99364-3b2a-4004-96ab-eb074a3c642c" containerName="swift-ring-rebalance" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.937925 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.941791 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.941917 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.944364 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f"] Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.946333 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8dds\" (UniqueName: \"kubernetes.io/projected/dd7d2885-9b40-433e-9d80-ef50a43ec502-kube-api-access-g8dds\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.946377 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-scripts\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.946517 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-dispersionconf\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.946589 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dd7d2885-9b40-433e-9d80-ef50a43ec502-etc-swift\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.946696 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:01 crc kubenswrapper[4831]: I0309 16:37:01.946749 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-swiftconf\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.047833 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-dispersionconf\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.048153 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dd7d2885-9b40-433e-9d80-ef50a43ec502-etc-swift\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: 
I0309 16:37:02.048693 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.048802 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-swiftconf\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.048943 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dds\" (UniqueName: \"kubernetes.io/projected/dd7d2885-9b40-433e-9d80-ef50a43ec502-kube-api-access-g8dds\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.049051 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-scripts\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.048609 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dd7d2885-9b40-433e-9d80-ef50a43ec502-etc-swift\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: 
I0309 16:37:02.049347 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.050047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-scripts\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.054360 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-swiftconf\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.056960 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-dispersionconf\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.066173 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dds\" (UniqueName: \"kubernetes.io/projected/dd7d2885-9b40-433e-9d80-ef50a43ec502-kube-api-access-g8dds\") pod \"swift-ring-rebalance-debug-h2c5f\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.254897 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:02 crc kubenswrapper[4831]: I0309 16:37:02.483547 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f"] Mar 09 16:37:03 crc kubenswrapper[4831]: I0309 16:37:03.022366 4831 scope.go:117] "RemoveContainer" containerID="2f9ccb3260343621d9d5081fd934e3e76b6371f6a2c90c82f1789790ad814d52" Mar 09 16:37:03 crc kubenswrapper[4831]: I0309 16:37:03.087810 4831 scope.go:117] "RemoveContainer" containerID="b586d3d078dc2463c14548774b22f585d8b393a971888bb3146c57ee121957c3" Mar 09 16:37:03 crc kubenswrapper[4831]: I0309 16:37:03.496764 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" event={"ID":"dd7d2885-9b40-433e-9d80-ef50a43ec502","Type":"ContainerStarted","Data":"1eacd36498644e7f770140db1b15c6e7f262f3a8b54e0d8ba5974bf993d2a2c9"} Mar 09 16:37:03 crc kubenswrapper[4831]: I0309 16:37:03.496816 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" event={"ID":"dd7d2885-9b40-433e-9d80-ef50a43ec502","Type":"ContainerStarted","Data":"40b430b4b20f5b23e553855f9417a89b97b25b5661f078f140032166582f4da1"} Mar 09 16:37:03 crc kubenswrapper[4831]: I0309 16:37:03.531149 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" podStartSLOduration=2.531123835 podStartE2EDuration="2.531123835s" podCreationTimestamp="2026-03-09 16:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:03.524006413 +0000 UTC m=+2350.657688866" watchObservedRunningTime="2026-03-09 16:37:03.531123835 +0000 UTC m=+2350.664806308" Mar 09 16:37:04 crc kubenswrapper[4831]: I0309 16:37:04.506906 4831 generic.go:334] "Generic 
(PLEG): container finished" podID="dd7d2885-9b40-433e-9d80-ef50a43ec502" containerID="1eacd36498644e7f770140db1b15c6e7f262f3a8b54e0d8ba5974bf993d2a2c9" exitCode=0 Mar 09 16:37:04 crc kubenswrapper[4831]: I0309 16:37:04.506968 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" event={"ID":"dd7d2885-9b40-433e-9d80-ef50a43ec502","Type":"ContainerDied","Data":"1eacd36498644e7f770140db1b15c6e7f262f3a8b54e0d8ba5974bf993d2a2c9"} Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.783015 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.812898 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-swiftconf\") pod \"dd7d2885-9b40-433e-9d80-ef50a43ec502\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.812937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-dispersionconf\") pod \"dd7d2885-9b40-433e-9d80-ef50a43ec502\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.812972 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-scripts\") pod \"dd7d2885-9b40-433e-9d80-ef50a43ec502\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.813013 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dd7d2885-9b40-433e-9d80-ef50a43ec502-etc-swift\") pod 
\"dd7d2885-9b40-433e-9d80-ef50a43ec502\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.813065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8dds\" (UniqueName: \"kubernetes.io/projected/dd7d2885-9b40-433e-9d80-ef50a43ec502-kube-api-access-g8dds\") pod \"dd7d2885-9b40-433e-9d80-ef50a43ec502\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.813082 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-ring-data-devices\") pod \"dd7d2885-9b40-433e-9d80-ef50a43ec502\" (UID: \"dd7d2885-9b40-433e-9d80-ef50a43ec502\") " Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.813925 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dd7d2885-9b40-433e-9d80-ef50a43ec502" (UID: "dd7d2885-9b40-433e-9d80-ef50a43ec502"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.814514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7d2885-9b40-433e-9d80-ef50a43ec502-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dd7d2885-9b40-433e-9d80-ef50a43ec502" (UID: "dd7d2885-9b40-433e-9d80-ef50a43ec502"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.821897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7d2885-9b40-433e-9d80-ef50a43ec502-kube-api-access-g8dds" (OuterVolumeSpecName: "kube-api-access-g8dds") pod "dd7d2885-9b40-433e-9d80-ef50a43ec502" (UID: "dd7d2885-9b40-433e-9d80-ef50a43ec502"). InnerVolumeSpecName "kube-api-access-g8dds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.824471 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f"] Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.834634 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f"] Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.845587 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-scripts" (OuterVolumeSpecName: "scripts") pod "dd7d2885-9b40-433e-9d80-ef50a43ec502" (UID: "dd7d2885-9b40-433e-9d80-ef50a43ec502"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.848226 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dd7d2885-9b40-433e-9d80-ef50a43ec502" (UID: "dd7d2885-9b40-433e-9d80-ef50a43ec502"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.848650 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dd7d2885-9b40-433e-9d80-ef50a43ec502" (UID: "dd7d2885-9b40-433e-9d80-ef50a43ec502"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.914270 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.914308 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dd7d2885-9b40-433e-9d80-ef50a43ec502-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.914317 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.914327 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dd7d2885-9b40-433e-9d80-ef50a43ec502-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.914336 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8dds\" (UniqueName: \"kubernetes.io/projected/dd7d2885-9b40-433e-9d80-ef50a43ec502-kube-api-access-g8dds\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:05 crc kubenswrapper[4831]: I0309 16:37:05.914346 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/dd7d2885-9b40-433e-9d80-ef50a43ec502-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.531223 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40b430b4b20f5b23e553855f9417a89b97b25b5661f078f140032166582f4da1" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.531300 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2c5f" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.975797 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-45c8x"] Mar 09 16:37:06 crc kubenswrapper[4831]: E0309 16:37:06.976386 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7d2885-9b40-433e-9d80-ef50a43ec502" containerName="swift-ring-rebalance" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.976467 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7d2885-9b40-433e-9d80-ef50a43ec502" containerName="swift-ring-rebalance" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.976657 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7d2885-9b40-433e-9d80-ef50a43ec502" containerName="swift-ring-rebalance" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.977322 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.980980 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.981174 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:06 crc kubenswrapper[4831]: I0309 16:37:06.984418 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-45c8x"] Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.137315 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-dispersionconf\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.137697 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-scripts\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.137786 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-swiftconf\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.137839 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/300ba773-3c30-470b-b758-3acf9fe876a5-etc-swift\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.137868 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8dbd\" (UniqueName: \"kubernetes.io/projected/300ba773-3c30-470b-b758-3acf9fe876a5-kube-api-access-q8dbd\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.137912 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-ring-data-devices\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.239021 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-scripts\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.239135 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-swiftconf\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.239198 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/300ba773-3c30-470b-b758-3acf9fe876a5-etc-swift\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.239239 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8dbd\" (UniqueName: \"kubernetes.io/projected/300ba773-3c30-470b-b758-3acf9fe876a5-kube-api-access-q8dbd\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.239294 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-ring-data-devices\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.239346 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-dispersionconf\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.239799 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/300ba773-3c30-470b-b758-3acf9fe876a5-etc-swift\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 
16:37:07.240183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-ring-data-devices\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.240281 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-scripts\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.243676 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-swiftconf\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.245129 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-dispersionconf\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.258299 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8dbd\" (UniqueName: \"kubernetes.io/projected/300ba773-3c30-470b-b758-3acf9fe876a5-kube-api-access-q8dbd\") pod \"swift-ring-rebalance-debug-45c8x\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.294142 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.627895 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7d2885-9b40-433e-9d80-ef50a43ec502" path="/var/lib/kubelet/pods/dd7d2885-9b40-433e-9d80-ef50a43ec502/volumes" Mar 09 16:37:07 crc kubenswrapper[4831]: I0309 16:37:07.725570 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-45c8x"] Mar 09 16:37:08 crc kubenswrapper[4831]: I0309 16:37:08.547934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" event={"ID":"300ba773-3c30-470b-b758-3acf9fe876a5","Type":"ContainerStarted","Data":"3aa6b55d3123155335bb8b069017cf8ca4f65e9b70655a8cd396be081867c61a"} Mar 09 16:37:08 crc kubenswrapper[4831]: I0309 16:37:08.548430 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" event={"ID":"300ba773-3c30-470b-b758-3acf9fe876a5","Type":"ContainerStarted","Data":"cabf0f7850298c46ee5a2e6a12d0d6c49f663f49538d20eb58df4fceefe54f9c"} Mar 09 16:37:08 crc kubenswrapper[4831]: I0309 16:37:08.572115 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" podStartSLOduration=2.572090101 podStartE2EDuration="2.572090101s" podCreationTimestamp="2026-03-09 16:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:08.56498068 +0000 UTC m=+2355.698663103" watchObservedRunningTime="2026-03-09 16:37:08.572090101 +0000 UTC m=+2355.705772544" Mar 09 16:37:09 crc kubenswrapper[4831]: I0309 16:37:09.560777 4831 generic.go:334] "Generic (PLEG): container finished" podID="300ba773-3c30-470b-b758-3acf9fe876a5" 
containerID="3aa6b55d3123155335bb8b069017cf8ca4f65e9b70655a8cd396be081867c61a" exitCode=0 Mar 09 16:37:09 crc kubenswrapper[4831]: I0309 16:37:09.560850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" event={"ID":"300ba773-3c30-470b-b758-3acf9fe876a5","Type":"ContainerDied","Data":"3aa6b55d3123155335bb8b069017cf8ca4f65e9b70655a8cd396be081867c61a"} Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.858670 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.894942 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-45c8x"] Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.900563 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-45c8x"] Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.999206 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-scripts\") pod \"300ba773-3c30-470b-b758-3acf9fe876a5\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.999266 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8dbd\" (UniqueName: \"kubernetes.io/projected/300ba773-3c30-470b-b758-3acf9fe876a5-kube-api-access-q8dbd\") pod \"300ba773-3c30-470b-b758-3acf9fe876a5\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.999321 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-dispersionconf\") pod \"300ba773-3c30-470b-b758-3acf9fe876a5\" (UID: 
\"300ba773-3c30-470b-b758-3acf9fe876a5\") " Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.999356 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/300ba773-3c30-470b-b758-3acf9fe876a5-etc-swift\") pod \"300ba773-3c30-470b-b758-3acf9fe876a5\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.999422 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-swiftconf\") pod \"300ba773-3c30-470b-b758-3acf9fe876a5\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " Mar 09 16:37:10 crc kubenswrapper[4831]: I0309 16:37:10.999444 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-ring-data-devices\") pod \"300ba773-3c30-470b-b758-3acf9fe876a5\" (UID: \"300ba773-3c30-470b-b758-3acf9fe876a5\") " Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.000198 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300ba773-3c30-470b-b758-3acf9fe876a5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "300ba773-3c30-470b-b758-3acf9fe876a5" (UID: "300ba773-3c30-470b-b758-3acf9fe876a5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.000344 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "300ba773-3c30-470b-b758-3acf9fe876a5" (UID: "300ba773-3c30-470b-b758-3acf9fe876a5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.004557 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300ba773-3c30-470b-b758-3acf9fe876a5-kube-api-access-q8dbd" (OuterVolumeSpecName: "kube-api-access-q8dbd") pod "300ba773-3c30-470b-b758-3acf9fe876a5" (UID: "300ba773-3c30-470b-b758-3acf9fe876a5"). InnerVolumeSpecName "kube-api-access-q8dbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.021382 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "300ba773-3c30-470b-b758-3acf9fe876a5" (UID: "300ba773-3c30-470b-b758-3acf9fe876a5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.023491 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-scripts" (OuterVolumeSpecName: "scripts") pod "300ba773-3c30-470b-b758-3acf9fe876a5" (UID: "300ba773-3c30-470b-b758-3acf9fe876a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.025077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "300ba773-3c30-470b-b758-3acf9fe876a5" (UID: "300ba773-3c30-470b-b758-3acf9fe876a5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.100932 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.100980 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.100997 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/300ba773-3c30-470b-b758-3acf9fe876a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.101012 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8dbd\" (UniqueName: \"kubernetes.io/projected/300ba773-3c30-470b-b758-3acf9fe876a5-kube-api-access-q8dbd\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.101025 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/300ba773-3c30-470b-b758-3acf9fe876a5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.101036 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/300ba773-3c30-470b-b758-3acf9fe876a5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.586068 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cabf0f7850298c46ee5a2e6a12d0d6c49f663f49538d20eb58df4fceefe54f9c" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.586132 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-45c8x" Mar 09 16:37:11 crc kubenswrapper[4831]: I0309 16:37:11.628770 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300ba773-3c30-470b-b758-3acf9fe876a5" path="/var/lib/kubelet/pods/300ba773-3c30-470b-b758-3acf9fe876a5/volumes" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.032479 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cbrch"] Mar 09 16:37:12 crc kubenswrapper[4831]: E0309 16:37:12.037363 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300ba773-3c30-470b-b758-3acf9fe876a5" containerName="swift-ring-rebalance" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.037387 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="300ba773-3c30-470b-b758-3acf9fe876a5" containerName="swift-ring-rebalance" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.037614 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="300ba773-3c30-470b-b758-3acf9fe876a5" containerName="swift-ring-rebalance" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.038131 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.040249 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.041505 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.044980 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cbrch"] Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.215796 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs9hf\" (UniqueName: \"kubernetes.io/projected/94be8de9-1ec7-416a-a64c-124898099707-kube-api-access-hs9hf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.216045 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-swiftconf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.216315 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-ring-data-devices\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.216364 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-dispersionconf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.216496 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-scripts\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.216538 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94be8de9-1ec7-416a-a64c-124898099707-etc-swift\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.318138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-scripts\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.318222 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94be8de9-1ec7-416a-a64c-124898099707-etc-swift\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.318263 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs9hf\" (UniqueName: \"kubernetes.io/projected/94be8de9-1ec7-416a-a64c-124898099707-kube-api-access-hs9hf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.318316 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-swiftconf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.318336 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-ring-data-devices\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.318357 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-dispersionconf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.319005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-scripts\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 
16:37:12.319161 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94be8de9-1ec7-416a-a64c-124898099707-etc-swift\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.319615 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-ring-data-devices\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.323433 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-dispersionconf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.329021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-swiftconf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.333631 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs9hf\" (UniqueName: \"kubernetes.io/projected/94be8de9-1ec7-416a-a64c-124898099707-kube-api-access-hs9hf\") pod \"swift-ring-rebalance-debug-cbrch\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.356009 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.551347 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cbrch"] Mar 09 16:37:12 crc kubenswrapper[4831]: W0309 16:37:12.557666 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94be8de9_1ec7_416a_a64c_124898099707.slice/crio-93eae4be29c0cf75292feb9de4a28c80c266cd265e1fcc3a3503f54909cfbbc8 WatchSource:0}: Error finding container 93eae4be29c0cf75292feb9de4a28c80c266cd265e1fcc3a3503f54909cfbbc8: Status 404 returned error can't find the container with id 93eae4be29c0cf75292feb9de4a28c80c266cd265e1fcc3a3503f54909cfbbc8 Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.596432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" event={"ID":"94be8de9-1ec7-416a-a64c-124898099707","Type":"ContainerStarted","Data":"93eae4be29c0cf75292feb9de4a28c80c266cd265e1fcc3a3503f54909cfbbc8"} Mar 09 16:37:12 crc kubenswrapper[4831]: I0309 16:37:12.618524 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:37:12 crc kubenswrapper[4831]: E0309 16:37:12.618890 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:37:13 crc kubenswrapper[4831]: I0309 16:37:13.607046 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" event={"ID":"94be8de9-1ec7-416a-a64c-124898099707","Type":"ContainerStarted","Data":"605adcaacd35d69fcc293b12923edb09fc03cb9ad5499ae2da8af83833c7dd70"} Mar 09 16:37:13 crc kubenswrapper[4831]: I0309 16:37:13.660633 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" podStartSLOduration=1.660607846 podStartE2EDuration="1.660607846s" podCreationTimestamp="2026-03-09 16:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:13.650598662 +0000 UTC m=+2360.784281085" watchObservedRunningTime="2026-03-09 16:37:13.660607846 +0000 UTC m=+2360.794290279" Mar 09 16:37:14 crc kubenswrapper[4831]: I0309 16:37:14.622803 4831 generic.go:334] "Generic (PLEG): container finished" podID="94be8de9-1ec7-416a-a64c-124898099707" containerID="605adcaacd35d69fcc293b12923edb09fc03cb9ad5499ae2da8af83833c7dd70" exitCode=0 Mar 09 16:37:14 crc kubenswrapper[4831]: I0309 16:37:14.622863 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" event={"ID":"94be8de9-1ec7-416a-a64c-124898099707","Type":"ContainerDied","Data":"605adcaacd35d69fcc293b12923edb09fc03cb9ad5499ae2da8af83833c7dd70"} Mar 09 16:37:15 crc kubenswrapper[4831]: I0309 16:37:15.923025 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:15 crc kubenswrapper[4831]: I0309 16:37:15.960146 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cbrch"] Mar 09 16:37:15 crc kubenswrapper[4831]: I0309 16:37:15.966251 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cbrch"] Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.085914 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-ring-data-devices\") pod \"94be8de9-1ec7-416a-a64c-124898099707\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.085998 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-swiftconf\") pod \"94be8de9-1ec7-416a-a64c-124898099707\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.086031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-scripts\") pod \"94be8de9-1ec7-416a-a64c-124898099707\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.086143 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs9hf\" (UniqueName: \"kubernetes.io/projected/94be8de9-1ec7-416a-a64c-124898099707-kube-api-access-hs9hf\") pod \"94be8de9-1ec7-416a-a64c-124898099707\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.086175 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-dispersionconf\") pod \"94be8de9-1ec7-416a-a64c-124898099707\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.086204 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94be8de9-1ec7-416a-a64c-124898099707-etc-swift\") pod \"94be8de9-1ec7-416a-a64c-124898099707\" (UID: \"94be8de9-1ec7-416a-a64c-124898099707\") " Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.087079 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "94be8de9-1ec7-416a-a64c-124898099707" (UID: "94be8de9-1ec7-416a-a64c-124898099707"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.087321 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94be8de9-1ec7-416a-a64c-124898099707-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "94be8de9-1ec7-416a-a64c-124898099707" (UID: "94be8de9-1ec7-416a-a64c-124898099707"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.087575 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94be8de9-1ec7-416a-a64c-124898099707-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.087604 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.091920 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94be8de9-1ec7-416a-a64c-124898099707-kube-api-access-hs9hf" (OuterVolumeSpecName: "kube-api-access-hs9hf") pod "94be8de9-1ec7-416a-a64c-124898099707" (UID: "94be8de9-1ec7-416a-a64c-124898099707"). InnerVolumeSpecName "kube-api-access-hs9hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.113246 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "94be8de9-1ec7-416a-a64c-124898099707" (UID: "94be8de9-1ec7-416a-a64c-124898099707"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.120286 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-scripts" (OuterVolumeSpecName: "scripts") pod "94be8de9-1ec7-416a-a64c-124898099707" (UID: "94be8de9-1ec7-416a-a64c-124898099707"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.122389 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "94be8de9-1ec7-416a-a64c-124898099707" (UID: "94be8de9-1ec7-416a-a64c-124898099707"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.188989 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.189039 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94be8de9-1ec7-416a-a64c-124898099707-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.189055 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs9hf\" (UniqueName: \"kubernetes.io/projected/94be8de9-1ec7-416a-a64c-124898099707-kube-api-access-hs9hf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.189074 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94be8de9-1ec7-416a-a64c-124898099707-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.639889 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93eae4be29c0cf75292feb9de4a28c80c266cd265e1fcc3a3503f54909cfbbc8" Mar 09 16:37:16 crc kubenswrapper[4831]: I0309 16:37:16.639970 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cbrch" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.078647 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dznmx"] Mar 09 16:37:17 crc kubenswrapper[4831]: E0309 16:37:17.079858 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94be8de9-1ec7-416a-a64c-124898099707" containerName="swift-ring-rebalance" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.079999 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="94be8de9-1ec7-416a-a64c-124898099707" containerName="swift-ring-rebalance" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.080219 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="94be8de9-1ec7-416a-a64c-124898099707" containerName="swift-ring-rebalance" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.081059 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.083462 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.083473 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.090133 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dznmx"] Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.102207 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-swiftconf\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.102459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-scripts\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.102560 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25zbq\" (UniqueName: \"kubernetes.io/projected/f976799f-fc03-46d0-a5f8-2356b6165b5f-kube-api-access-25zbq\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.102670 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-dispersionconf\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.102764 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f976799f-fc03-46d0-a5f8-2356b6165b5f-etc-swift\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.102915 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-ring-data-devices\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.203471 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-ring-data-devices\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.203803 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-swiftconf\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.203824 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-scripts\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.204598 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25zbq\" (UniqueName: \"kubernetes.io/projected/f976799f-fc03-46d0-a5f8-2356b6165b5f-kube-api-access-25zbq\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.204183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-ring-data-devices\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.204534 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-scripts\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.204683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-dispersionconf\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.204823 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f976799f-fc03-46d0-a5f8-2356b6165b5f-etc-swift\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.205153 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f976799f-fc03-46d0-a5f8-2356b6165b5f-etc-swift\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.209049 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-swiftconf\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.210031 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-dispersionconf\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.220172 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25zbq\" (UniqueName: \"kubernetes.io/projected/f976799f-fc03-46d0-a5f8-2356b6165b5f-kube-api-access-25zbq\") pod \"swift-ring-rebalance-debug-dznmx\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.396524 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.628346 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94be8de9-1ec7-416a-a64c-124898099707" path="/var/lib/kubelet/pods/94be8de9-1ec7-416a-a64c-124898099707/volumes" Mar 09 16:37:17 crc kubenswrapper[4831]: I0309 16:37:17.853906 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dznmx"] Mar 09 16:37:18 crc kubenswrapper[4831]: I0309 16:37:18.656417 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" event={"ID":"f976799f-fc03-46d0-a5f8-2356b6165b5f","Type":"ContainerStarted","Data":"d79683b0f87df73add8cb0aed22dd638bd7e2faeea01b111120b349a8bbdf707"} Mar 09 16:37:18 crc kubenswrapper[4831]: I0309 16:37:18.656788 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" event={"ID":"f976799f-fc03-46d0-a5f8-2356b6165b5f","Type":"ContainerStarted","Data":"c30c3e83b0c96e9b8f3deb078f0439b12054112d270352b155bccc69ba2d67ed"} Mar 09 16:37:18 crc kubenswrapper[4831]: I0309 16:37:18.675860 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" podStartSLOduration=1.6758359619999998 podStartE2EDuration="1.675835962s" podCreationTimestamp="2026-03-09 16:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:18.670081049 +0000 UTC m=+2365.803763472" watchObservedRunningTime="2026-03-09 16:37:18.675835962 +0000 UTC m=+2365.809518385" Mar 09 16:37:19 crc kubenswrapper[4831]: I0309 16:37:19.664520 4831 generic.go:334] "Generic (PLEG): container finished" podID="f976799f-fc03-46d0-a5f8-2356b6165b5f" containerID="d79683b0f87df73add8cb0aed22dd638bd7e2faeea01b111120b349a8bbdf707" 
exitCode=0 Mar 09 16:37:19 crc kubenswrapper[4831]: I0309 16:37:19.664597 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" event={"ID":"f976799f-fc03-46d0-a5f8-2356b6165b5f","Type":"ContainerDied","Data":"d79683b0f87df73add8cb0aed22dd638bd7e2faeea01b111120b349a8bbdf707"} Mar 09 16:37:20 crc kubenswrapper[4831]: I0309 16:37:20.997224 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.024027 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dznmx"] Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.037634 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dznmx"] Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.166777 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f976799f-fc03-46d0-a5f8-2356b6165b5f-etc-swift\") pod \"f976799f-fc03-46d0-a5f8-2356b6165b5f\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.166842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25zbq\" (UniqueName: \"kubernetes.io/projected/f976799f-fc03-46d0-a5f8-2356b6165b5f-kube-api-access-25zbq\") pod \"f976799f-fc03-46d0-a5f8-2356b6165b5f\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.166879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-scripts\") pod \"f976799f-fc03-46d0-a5f8-2356b6165b5f\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 
16:37:21.166922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-dispersionconf\") pod \"f976799f-fc03-46d0-a5f8-2356b6165b5f\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.166984 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-swiftconf\") pod \"f976799f-fc03-46d0-a5f8-2356b6165b5f\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.167106 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-ring-data-devices\") pod \"f976799f-fc03-46d0-a5f8-2356b6165b5f\" (UID: \"f976799f-fc03-46d0-a5f8-2356b6165b5f\") " Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.167626 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f976799f-fc03-46d0-a5f8-2356b6165b5f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f976799f-fc03-46d0-a5f8-2356b6165b5f" (UID: "f976799f-fc03-46d0-a5f8-2356b6165b5f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.167756 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f976799f-fc03-46d0-a5f8-2356b6165b5f" (UID: "f976799f-fc03-46d0-a5f8-2356b6165b5f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.172712 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f976799f-fc03-46d0-a5f8-2356b6165b5f-kube-api-access-25zbq" (OuterVolumeSpecName: "kube-api-access-25zbq") pod "f976799f-fc03-46d0-a5f8-2356b6165b5f" (UID: "f976799f-fc03-46d0-a5f8-2356b6165b5f"). InnerVolumeSpecName "kube-api-access-25zbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.188968 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f976799f-fc03-46d0-a5f8-2356b6165b5f" (UID: "f976799f-fc03-46d0-a5f8-2356b6165b5f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.189740 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f976799f-fc03-46d0-a5f8-2356b6165b5f" (UID: "f976799f-fc03-46d0-a5f8-2356b6165b5f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.193289 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-scripts" (OuterVolumeSpecName: "scripts") pod "f976799f-fc03-46d0-a5f8-2356b6165b5f" (UID: "f976799f-fc03-46d0-a5f8-2356b6165b5f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.268944 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25zbq\" (UniqueName: \"kubernetes.io/projected/f976799f-fc03-46d0-a5f8-2356b6165b5f-kube-api-access-25zbq\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.268986 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.268995 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.269003 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f976799f-fc03-46d0-a5f8-2356b6165b5f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.269012 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f976799f-fc03-46d0-a5f8-2356b6165b5f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.269020 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f976799f-fc03-46d0-a5f8-2356b6165b5f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.628028 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f976799f-fc03-46d0-a5f8-2356b6165b5f" path="/var/lib/kubelet/pods/f976799f-fc03-46d0-a5f8-2356b6165b5f/volumes" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.679492 4831 scope.go:117] "RemoveContainer" 
containerID="d79683b0f87df73add8cb0aed22dd638bd7e2faeea01b111120b349a8bbdf707" Mar 09 16:37:21 crc kubenswrapper[4831]: I0309 16:37:21.679548 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dznmx" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.153515 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6"] Mar 09 16:37:22 crc kubenswrapper[4831]: E0309 16:37:22.153808 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f976799f-fc03-46d0-a5f8-2356b6165b5f" containerName="swift-ring-rebalance" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.153819 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f976799f-fc03-46d0-a5f8-2356b6165b5f" containerName="swift-ring-rebalance" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.153956 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f976799f-fc03-46d0-a5f8-2356b6165b5f" containerName="swift-ring-rebalance" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.154513 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.156626 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.157552 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.165135 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6"] Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.290522 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-ring-data-devices\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.290776 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm9j8\" (UniqueName: \"kubernetes.io/projected/b64b0be5-828f-4a66-8bae-66edaa2c740a-kube-api-access-rm9j8\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.290809 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b64b0be5-828f-4a66-8bae-66edaa2c740a-etc-swift\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.290841 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-swiftconf\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.290891 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-scripts\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.291038 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-dispersionconf\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.392381 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-dispersionconf\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.392514 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-ring-data-devices\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc 
kubenswrapper[4831]: I0309 16:37:22.392540 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm9j8\" (UniqueName: \"kubernetes.io/projected/b64b0be5-828f-4a66-8bae-66edaa2c740a-kube-api-access-rm9j8\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.392570 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b64b0be5-828f-4a66-8bae-66edaa2c740a-etc-swift\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.392604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-swiftconf\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.392664 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-scripts\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.393359 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-ring-data-devices\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc 
kubenswrapper[4831]: I0309 16:37:22.393553 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-scripts\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.393772 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b64b0be5-828f-4a66-8bae-66edaa2c740a-etc-swift\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.397703 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-dispersionconf\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.398785 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-swiftconf\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.408336 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm9j8\" (UniqueName: \"kubernetes.io/projected/b64b0be5-828f-4a66-8bae-66edaa2c740a-kube-api-access-rm9j8\") pod \"swift-ring-rebalance-debug-6qgl6\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 
16:37:22.471088 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:22 crc kubenswrapper[4831]: I0309 16:37:22.890257 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6"] Mar 09 16:37:23 crc kubenswrapper[4831]: I0309 16:37:23.623191 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:37:23 crc kubenswrapper[4831]: E0309 16:37:23.623782 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:37:23 crc kubenswrapper[4831]: I0309 16:37:23.697797 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" event={"ID":"b64b0be5-828f-4a66-8bae-66edaa2c740a","Type":"ContainerStarted","Data":"ac8b006ae1625d209a9d16da32ed0c840a4fe96f726a777ba009bba858dc29f6"} Mar 09 16:37:23 crc kubenswrapper[4831]: I0309 16:37:23.697846 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" event={"ID":"b64b0be5-828f-4a66-8bae-66edaa2c740a","Type":"ContainerStarted","Data":"e339c814d8e0ef0285cf93ea07274a54e3b83d279a0a1de9a575378921ab16d3"} Mar 09 16:37:23 crc kubenswrapper[4831]: I0309 16:37:23.716860 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" podStartSLOduration=1.716839019 podStartE2EDuration="1.716839019s" podCreationTimestamp="2026-03-09 16:37:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:23.713946027 +0000 UTC m=+2370.847628460" watchObservedRunningTime="2026-03-09 16:37:23.716839019 +0000 UTC m=+2370.850521442" Mar 09 16:37:24 crc kubenswrapper[4831]: I0309 16:37:24.708428 4831 generic.go:334] "Generic (PLEG): container finished" podID="b64b0be5-828f-4a66-8bae-66edaa2c740a" containerID="ac8b006ae1625d209a9d16da32ed0c840a4fe96f726a777ba009bba858dc29f6" exitCode=0 Mar 09 16:37:24 crc kubenswrapper[4831]: I0309 16:37:24.708523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" event={"ID":"b64b0be5-828f-4a66-8bae-66edaa2c740a","Type":"ContainerDied","Data":"ac8b006ae1625d209a9d16da32ed0c840a4fe96f726a777ba009bba858dc29f6"} Mar 09 16:37:25 crc kubenswrapper[4831]: I0309 16:37:25.988078 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.025812 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6"] Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.031285 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6"] Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.154356 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b64b0be5-828f-4a66-8bae-66edaa2c740a-etc-swift\") pod \"b64b0be5-828f-4a66-8bae-66edaa2c740a\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.154415 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-swiftconf\") pod 
\"b64b0be5-828f-4a66-8bae-66edaa2c740a\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.154435 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-dispersionconf\") pod \"b64b0be5-828f-4a66-8bae-66edaa2c740a\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.154465 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm9j8\" (UniqueName: \"kubernetes.io/projected/b64b0be5-828f-4a66-8bae-66edaa2c740a-kube-api-access-rm9j8\") pod \"b64b0be5-828f-4a66-8bae-66edaa2c740a\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.154614 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-ring-data-devices\") pod \"b64b0be5-828f-4a66-8bae-66edaa2c740a\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.154666 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-scripts\") pod \"b64b0be5-828f-4a66-8bae-66edaa2c740a\" (UID: \"b64b0be5-828f-4a66-8bae-66edaa2c740a\") " Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.155240 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b64b0be5-828f-4a66-8bae-66edaa2c740a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b64b0be5-828f-4a66-8bae-66edaa2c740a" (UID: "b64b0be5-828f-4a66-8bae-66edaa2c740a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.155941 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b64b0be5-828f-4a66-8bae-66edaa2c740a" (UID: "b64b0be5-828f-4a66-8bae-66edaa2c740a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.159138 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b64b0be5-828f-4a66-8bae-66edaa2c740a-kube-api-access-rm9j8" (OuterVolumeSpecName: "kube-api-access-rm9j8") pod "b64b0be5-828f-4a66-8bae-66edaa2c740a" (UID: "b64b0be5-828f-4a66-8bae-66edaa2c740a"). InnerVolumeSpecName "kube-api-access-rm9j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.175119 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-scripts" (OuterVolumeSpecName: "scripts") pod "b64b0be5-828f-4a66-8bae-66edaa2c740a" (UID: "b64b0be5-828f-4a66-8bae-66edaa2c740a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.176204 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b64b0be5-828f-4a66-8bae-66edaa2c740a" (UID: "b64b0be5-828f-4a66-8bae-66edaa2c740a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.191422 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b64b0be5-828f-4a66-8bae-66edaa2c740a" (UID: "b64b0be5-828f-4a66-8bae-66edaa2c740a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.256163 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.256193 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64b0be5-828f-4a66-8bae-66edaa2c740a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.256205 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b64b0be5-828f-4a66-8bae-66edaa2c740a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.256214 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.256222 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b64b0be5-828f-4a66-8bae-66edaa2c740a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.256232 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm9j8\" (UniqueName: 
\"kubernetes.io/projected/b64b0be5-828f-4a66-8bae-66edaa2c740a-kube-api-access-rm9j8\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.736200 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e339c814d8e0ef0285cf93ea07274a54e3b83d279a0a1de9a575378921ab16d3" Mar 09 16:37:26 crc kubenswrapper[4831]: I0309 16:37:26.736273 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6qgl6" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.152899 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x"] Mar 09 16:37:27 crc kubenswrapper[4831]: E0309 16:37:27.153220 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b64b0be5-828f-4a66-8bae-66edaa2c740a" containerName="swift-ring-rebalance" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.153237 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b64b0be5-828f-4a66-8bae-66edaa2c740a" containerName="swift-ring-rebalance" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.153437 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b64b0be5-828f-4a66-8bae-66edaa2c740a" containerName="swift-ring-rebalance" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.153973 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.156492 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.156987 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.165132 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x"] Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.269504 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-swiftconf\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.269609 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-ring-data-devices\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.269647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-dispersionconf\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.269668 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09126a7c-849a-4b77-902f-acf487009ed3-etc-swift\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.269695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-scripts\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.269713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9wkh\" (UniqueName: \"kubernetes.io/projected/09126a7c-849a-4b77-902f-acf487009ed3-kube-api-access-x9wkh\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.371686 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-swiftconf\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.371858 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-ring-data-devices\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc 
kubenswrapper[4831]: I0309 16:37:27.371944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-dispersionconf\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.371983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09126a7c-849a-4b77-902f-acf487009ed3-etc-swift\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.372052 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-scripts\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.372130 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkh\" (UniqueName: \"kubernetes.io/projected/09126a7c-849a-4b77-902f-acf487009ed3-kube-api-access-x9wkh\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.373451 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09126a7c-849a-4b77-902f-acf487009ed3-etc-swift\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc 
kubenswrapper[4831]: I0309 16:37:27.374454 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-scripts\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.374532 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-ring-data-devices\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.378746 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-dispersionconf\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.379327 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-swiftconf\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.391875 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9wkh\" (UniqueName: \"kubernetes.io/projected/09126a7c-849a-4b77-902f-acf487009ed3-kube-api-access-x9wkh\") pod \"swift-ring-rebalance-debug-xkk9x\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: 
I0309 16:37:27.473817 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.629056 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b64b0be5-828f-4a66-8bae-66edaa2c740a" path="/var/lib/kubelet/pods/b64b0be5-828f-4a66-8bae-66edaa2c740a/volumes" Mar 09 16:37:27 crc kubenswrapper[4831]: I0309 16:37:27.906658 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x"] Mar 09 16:37:27 crc kubenswrapper[4831]: W0309 16:37:27.919601 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09126a7c_849a_4b77_902f_acf487009ed3.slice/crio-1c7f79bc6b8180865f28d1b1483944ad17bf2512edb6f61649b5ade7bcdfa6fe WatchSource:0}: Error finding container 1c7f79bc6b8180865f28d1b1483944ad17bf2512edb6f61649b5ade7bcdfa6fe: Status 404 returned error can't find the container with id 1c7f79bc6b8180865f28d1b1483944ad17bf2512edb6f61649b5ade7bcdfa6fe Mar 09 16:37:28 crc kubenswrapper[4831]: I0309 16:37:28.766617 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" event={"ID":"09126a7c-849a-4b77-902f-acf487009ed3","Type":"ContainerStarted","Data":"9416e06e0b9bde3ac2bcd74f6643ef14f6e5ff931a5afa08f6e37c9e31f9c7b2"} Mar 09 16:37:28 crc kubenswrapper[4831]: I0309 16:37:28.766675 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" event={"ID":"09126a7c-849a-4b77-902f-acf487009ed3","Type":"ContainerStarted","Data":"1c7f79bc6b8180865f28d1b1483944ad17bf2512edb6f61649b5ade7bcdfa6fe"} Mar 09 16:37:28 crc kubenswrapper[4831]: I0309 16:37:28.785099 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" 
podStartSLOduration=1.785074679 podStartE2EDuration="1.785074679s" podCreationTimestamp="2026-03-09 16:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:28.784515293 +0000 UTC m=+2375.918197716" watchObservedRunningTime="2026-03-09 16:37:28.785074679 +0000 UTC m=+2375.918757122" Mar 09 16:37:29 crc kubenswrapper[4831]: I0309 16:37:29.781098 4831 generic.go:334] "Generic (PLEG): container finished" podID="09126a7c-849a-4b77-902f-acf487009ed3" containerID="9416e06e0b9bde3ac2bcd74f6643ef14f6e5ff931a5afa08f6e37c9e31f9c7b2" exitCode=0 Mar 09 16:37:29 crc kubenswrapper[4831]: I0309 16:37:29.781219 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" event={"ID":"09126a7c-849a-4b77-902f-acf487009ed3","Type":"ContainerDied","Data":"9416e06e0b9bde3ac2bcd74f6643ef14f6e5ff931a5afa08f6e37c9e31f9c7b2"} Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.122106 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.162454 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x"] Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.170366 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x"] Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.250097 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-dispersionconf\") pod \"09126a7c-849a-4b77-902f-acf487009ed3\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.250171 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9wkh\" (UniqueName: \"kubernetes.io/projected/09126a7c-849a-4b77-902f-acf487009ed3-kube-api-access-x9wkh\") pod \"09126a7c-849a-4b77-902f-acf487009ed3\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.250293 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09126a7c-849a-4b77-902f-acf487009ed3-etc-swift\") pod \"09126a7c-849a-4b77-902f-acf487009ed3\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.250333 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-scripts\") pod \"09126a7c-849a-4b77-902f-acf487009ed3\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.250463 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-swiftconf\") pod \"09126a7c-849a-4b77-902f-acf487009ed3\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.250529 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-ring-data-devices\") pod \"09126a7c-849a-4b77-902f-acf487009ed3\" (UID: \"09126a7c-849a-4b77-902f-acf487009ed3\") " Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.251516 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "09126a7c-849a-4b77-902f-acf487009ed3" (UID: "09126a7c-849a-4b77-902f-acf487009ed3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.251797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09126a7c-849a-4b77-902f-acf487009ed3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "09126a7c-849a-4b77-902f-acf487009ed3" (UID: "09126a7c-849a-4b77-902f-acf487009ed3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.257602 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09126a7c-849a-4b77-902f-acf487009ed3-kube-api-access-x9wkh" (OuterVolumeSpecName: "kube-api-access-x9wkh") pod "09126a7c-849a-4b77-902f-acf487009ed3" (UID: "09126a7c-849a-4b77-902f-acf487009ed3"). InnerVolumeSpecName "kube-api-access-x9wkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.283240 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "09126a7c-849a-4b77-902f-acf487009ed3" (UID: "09126a7c-849a-4b77-902f-acf487009ed3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.285151 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "09126a7c-849a-4b77-902f-acf487009ed3" (UID: "09126a7c-849a-4b77-902f-acf487009ed3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.292818 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-scripts" (OuterVolumeSpecName: "scripts") pod "09126a7c-849a-4b77-902f-acf487009ed3" (UID: "09126a7c-849a-4b77-902f-acf487009ed3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.352565 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.352599 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.352610 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9wkh\" (UniqueName: \"kubernetes.io/projected/09126a7c-849a-4b77-902f-acf487009ed3-kube-api-access-x9wkh\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.352621 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09126a7c-849a-4b77-902f-acf487009ed3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.352630 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09126a7c-849a-4b77-902f-acf487009ed3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.352638 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09126a7c-849a-4b77-902f-acf487009ed3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.632006 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09126a7c-849a-4b77-902f-acf487009ed3" path="/var/lib/kubelet/pods/09126a7c-849a-4b77-902f-acf487009ed3/volumes" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.804559 4831 scope.go:117] "RemoveContainer" 
containerID="9416e06e0b9bde3ac2bcd74f6643ef14f6e5ff931a5afa08f6e37c9e31f9c7b2" Mar 09 16:37:31 crc kubenswrapper[4831]: I0309 16:37:31.804623 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xkk9x" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.315008 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-24d6t"] Mar 09 16:37:32 crc kubenswrapper[4831]: E0309 16:37:32.315704 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09126a7c-849a-4b77-902f-acf487009ed3" containerName="swift-ring-rebalance" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.315722 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="09126a7c-849a-4b77-902f-acf487009ed3" containerName="swift-ring-rebalance" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.315890 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="09126a7c-849a-4b77-902f-acf487009ed3" containerName="swift-ring-rebalance" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.316352 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.319527 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.320229 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.328889 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-24d6t"] Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.368600 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-etc-swift\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.368648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7klq\" (UniqueName: \"kubernetes.io/projected/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-kube-api-access-d7klq\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.368789 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-scripts\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.369032 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-dispersionconf\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.369142 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-ring-data-devices\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.369184 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-swiftconf\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.470136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-dispersionconf\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.470208 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-ring-data-devices\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 
crc kubenswrapper[4831]: I0309 16:37:32.470243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-swiftconf\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.470273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-etc-swift\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.470294 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7klq\" (UniqueName: \"kubernetes.io/projected/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-kube-api-access-d7klq\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.470431 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-scripts\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.471165 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-etc-swift\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc 
kubenswrapper[4831]: I0309 16:37:32.471239 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-scripts\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.471257 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-ring-data-devices\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.474854 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-dispersionconf\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.475972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-swiftconf\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.499528 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7klq\" (UniqueName: \"kubernetes.io/projected/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-kube-api-access-d7klq\") pod \"swift-ring-rebalance-debug-24d6t\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: 
I0309 16:37:32.639052 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:32 crc kubenswrapper[4831]: I0309 16:37:32.935522 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-24d6t"] Mar 09 16:37:32 crc kubenswrapper[4831]: W0309 16:37:32.939235 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ae369e9_6ece_46b8_8bb4_3a1ddebc2f00.slice/crio-42be3935b63cd3464fa997be8a173dac10a02ba21f52eaed299724272bf3e46f WatchSource:0}: Error finding container 42be3935b63cd3464fa997be8a173dac10a02ba21f52eaed299724272bf3e46f: Status 404 returned error can't find the container with id 42be3935b63cd3464fa997be8a173dac10a02ba21f52eaed299724272bf3e46f Mar 09 16:37:33 crc kubenswrapper[4831]: I0309 16:37:33.826990 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" event={"ID":"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00","Type":"ContainerStarted","Data":"c1459bb063865602fde9243117afb48ea0c9849d0f17acc9809de9bc63f99e5b"} Mar 09 16:37:33 crc kubenswrapper[4831]: I0309 16:37:33.827278 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" event={"ID":"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00","Type":"ContainerStarted","Data":"42be3935b63cd3464fa997be8a173dac10a02ba21f52eaed299724272bf3e46f"} Mar 09 16:37:33 crc kubenswrapper[4831]: I0309 16:37:33.850696 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" podStartSLOduration=1.850670094 podStartE2EDuration="1.850670094s" podCreationTimestamp="2026-03-09 16:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:33.842334607 +0000 
UTC m=+2380.976017050" watchObservedRunningTime="2026-03-09 16:37:33.850670094 +0000 UTC m=+2380.984352517" Mar 09 16:37:34 crc kubenswrapper[4831]: I0309 16:37:34.836166 4831 generic.go:334] "Generic (PLEG): container finished" podID="9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" containerID="c1459bb063865602fde9243117afb48ea0c9849d0f17acc9809de9bc63f99e5b" exitCode=0 Mar 09 16:37:34 crc kubenswrapper[4831]: I0309 16:37:34.836225 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" event={"ID":"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00","Type":"ContainerDied","Data":"c1459bb063865602fde9243117afb48ea0c9849d0f17acc9809de9bc63f99e5b"} Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.113089 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.153523 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-24d6t"] Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.158933 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-24d6t"] Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.224238 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7klq\" (UniqueName: \"kubernetes.io/projected/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-kube-api-access-d7klq\") pod \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.224306 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-swiftconf\") pod \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 
16:37:36.224345 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-etc-swift\") pod \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.224424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-ring-data-devices\") pod \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.224445 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-scripts\") pod \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.224489 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-dispersionconf\") pod \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\" (UID: \"9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00\") " Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.225182 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" (UID: "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.225469 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" (UID: "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.229423 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-kube-api-access-d7klq" (OuterVolumeSpecName: "kube-api-access-d7klq") pod "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" (UID: "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00"). InnerVolumeSpecName "kube-api-access-d7klq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.249160 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" (UID: "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.251381 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" (UID: "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.255260 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-scripts" (OuterVolumeSpecName: "scripts") pod "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" (UID: "9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.326451 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7klq\" (UniqueName: \"kubernetes.io/projected/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-kube-api-access-d7klq\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.326490 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.326507 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.326519 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.326532 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.326564 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.854155 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42be3935b63cd3464fa997be8a173dac10a02ba21f52eaed299724272bf3e46f" Mar 09 16:37:36 crc kubenswrapper[4831]: I0309 16:37:36.854218 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-24d6t" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.301936 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhzch"] Mar 09 16:37:37 crc kubenswrapper[4831]: E0309 16:37:37.302275 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" containerName="swift-ring-rebalance" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.302288 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" containerName="swift-ring-rebalance" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.302453 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" containerName="swift-ring-rebalance" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.302950 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.306196 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.306230 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.314456 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhzch"] Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.441676 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-swiftconf\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.441762 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-scripts\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.441795 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7jwz\" (UniqueName: \"kubernetes.io/projected/dcbed267-4fda-461c-9924-0ca7d2a089b1-kube-api-access-s7jwz\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.441814 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.441833 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcbed267-4fda-461c-9924-0ca7d2a089b1-etc-swift\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.442084 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-dispersionconf\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.543577 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-scripts\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.543650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7jwz\" (UniqueName: \"kubernetes.io/projected/dcbed267-4fda-461c-9924-0ca7d2a089b1-kube-api-access-s7jwz\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 
crc kubenswrapper[4831]: I0309 16:37:37.543677 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.543712 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcbed267-4fda-461c-9924-0ca7d2a089b1-etc-swift\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.543766 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-dispersionconf\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.543807 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-swiftconf\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.544315 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcbed267-4fda-461c-9924-0ca7d2a089b1-etc-swift\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc 
kubenswrapper[4831]: I0309 16:37:37.544539 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.544583 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-scripts\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.548927 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-dispersionconf\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.549017 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-swiftconf\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.570132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7jwz\" (UniqueName: \"kubernetes.io/projected/dcbed267-4fda-461c-9924-0ca7d2a089b1-kube-api-access-s7jwz\") pod \"swift-ring-rebalance-debug-jhzch\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:37 crc kubenswrapper[4831]: 
I0309 16:37:37.617923 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:37:37 crc kubenswrapper[4831]: E0309 16:37:37.618157 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.633368 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00" path="/var/lib/kubelet/pods/9ae369e9-6ece-46b8-8bb4-3a1ddebc2f00/volumes" Mar 09 16:37:37 crc kubenswrapper[4831]: I0309 16:37:37.640056 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:38 crc kubenswrapper[4831]: I0309 16:37:38.077863 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhzch"] Mar 09 16:37:38 crc kubenswrapper[4831]: W0309 16:37:38.086576 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcbed267_4fda_461c_9924_0ca7d2a089b1.slice/crio-103708446e56f05cb8c7c54c96daf328ed5c0e7b1eb6d7cb0f92f54846c14be4 WatchSource:0}: Error finding container 103708446e56f05cb8c7c54c96daf328ed5c0e7b1eb6d7cb0f92f54846c14be4: Status 404 returned error can't find the container with id 103708446e56f05cb8c7c54c96daf328ed5c0e7b1eb6d7cb0f92f54846c14be4 Mar 09 16:37:38 crc kubenswrapper[4831]: I0309 16:37:38.873320 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" 
event={"ID":"dcbed267-4fda-461c-9924-0ca7d2a089b1","Type":"ContainerStarted","Data":"8786b7a534e3dcdb64b0343a830051d7fd070779f84d3e54f8d66df1324924bf"} Mar 09 16:37:38 crc kubenswrapper[4831]: I0309 16:37:38.874863 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" event={"ID":"dcbed267-4fda-461c-9924-0ca7d2a089b1","Type":"ContainerStarted","Data":"103708446e56f05cb8c7c54c96daf328ed5c0e7b1eb6d7cb0f92f54846c14be4"} Mar 09 16:37:38 crc kubenswrapper[4831]: I0309 16:37:38.897008 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" podStartSLOduration=1.896990772 podStartE2EDuration="1.896990772s" podCreationTimestamp="2026-03-09 16:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:38.889715176 +0000 UTC m=+2386.023397619" watchObservedRunningTime="2026-03-09 16:37:38.896990772 +0000 UTC m=+2386.030673195" Mar 09 16:37:39 crc kubenswrapper[4831]: I0309 16:37:39.889100 4831 generic.go:334] "Generic (PLEG): container finished" podID="dcbed267-4fda-461c-9924-0ca7d2a089b1" containerID="8786b7a534e3dcdb64b0343a830051d7fd070779f84d3e54f8d66df1324924bf" exitCode=0 Mar 09 16:37:39 crc kubenswrapper[4831]: I0309 16:37:39.889147 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" event={"ID":"dcbed267-4fda-461c-9924-0ca7d2a089b1","Type":"ContainerDied","Data":"8786b7a534e3dcdb64b0343a830051d7fd070779f84d3e54f8d66df1324924bf"} Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.329194 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-276cz"] Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.331770 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.345548 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-276cz"] Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.491561 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-utilities\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.491614 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-catalog-content\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.491655 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpfs\" (UniqueName: \"kubernetes.io/projected/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-kube-api-access-kmpfs\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.593286 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-utilities\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.593341 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-catalog-content\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.593386 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpfs\" (UniqueName: \"kubernetes.io/projected/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-kube-api-access-kmpfs\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.593848 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-utilities\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.594204 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-catalog-content\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.619464 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpfs\" (UniqueName: \"kubernetes.io/projected/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-kube-api-access-kmpfs\") pod \"certified-operators-276cz\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:40 crc kubenswrapper[4831]: I0309 16:37:40.656464 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:41 crc kubenswrapper[4831]: W0309 16:37:41.225087 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9fb4ed2_406c_45c5_b566_a4a26df9fd2b.slice/crio-0971fa96565883b05b2db58e071ba30fd6d40eae4e26047fc3e8cfb94c0f10c0 WatchSource:0}: Error finding container 0971fa96565883b05b2db58e071ba30fd6d40eae4e26047fc3e8cfb94c0f10c0: Status 404 returned error can't find the container with id 0971fa96565883b05b2db58e071ba30fd6d40eae4e26047fc3e8cfb94c0f10c0 Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.227876 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-276cz"] Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.241253 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.270793 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhzch"] Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.282262 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhzch"] Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.404500 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-scripts\") pod \"dcbed267-4fda-461c-9924-0ca7d2a089b1\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.404570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-dispersionconf\") pod \"dcbed267-4fda-461c-9924-0ca7d2a089b1\" (UID: 
\"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.404613 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-ring-data-devices\") pod \"dcbed267-4fda-461c-9924-0ca7d2a089b1\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.404643 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7jwz\" (UniqueName: \"kubernetes.io/projected/dcbed267-4fda-461c-9924-0ca7d2a089b1-kube-api-access-s7jwz\") pod \"dcbed267-4fda-461c-9924-0ca7d2a089b1\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.404702 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-swiftconf\") pod \"dcbed267-4fda-461c-9924-0ca7d2a089b1\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.404766 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcbed267-4fda-461c-9924-0ca7d2a089b1-etc-swift\") pod \"dcbed267-4fda-461c-9924-0ca7d2a089b1\" (UID: \"dcbed267-4fda-461c-9924-0ca7d2a089b1\") " Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.405777 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcbed267-4fda-461c-9924-0ca7d2a089b1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dcbed267-4fda-461c-9924-0ca7d2a089b1" (UID: "dcbed267-4fda-461c-9924-0ca7d2a089b1"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.406525 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dcbed267-4fda-461c-9924-0ca7d2a089b1" (UID: "dcbed267-4fda-461c-9924-0ca7d2a089b1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.412831 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcbed267-4fda-461c-9924-0ca7d2a089b1-kube-api-access-s7jwz" (OuterVolumeSpecName: "kube-api-access-s7jwz") pod "dcbed267-4fda-461c-9924-0ca7d2a089b1" (UID: "dcbed267-4fda-461c-9924-0ca7d2a089b1"). InnerVolumeSpecName "kube-api-access-s7jwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.428976 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-scripts" (OuterVolumeSpecName: "scripts") pod "dcbed267-4fda-461c-9924-0ca7d2a089b1" (UID: "dcbed267-4fda-461c-9924-0ca7d2a089b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.431586 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dcbed267-4fda-461c-9924-0ca7d2a089b1" (UID: "dcbed267-4fda-461c-9924-0ca7d2a089b1"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.434575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dcbed267-4fda-461c-9924-0ca7d2a089b1" (UID: "dcbed267-4fda-461c-9924-0ca7d2a089b1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.506835 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.506870 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7jwz\" (UniqueName: \"kubernetes.io/projected/dcbed267-4fda-461c-9924-0ca7d2a089b1-kube-api-access-s7jwz\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.506888 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.506900 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcbed267-4fda-461c-9924-0ca7d2a089b1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.506910 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcbed267-4fda-461c-9924-0ca7d2a089b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.506922 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/dcbed267-4fda-461c-9924-0ca7d2a089b1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.643979 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcbed267-4fda-461c-9924-0ca7d2a089b1" path="/var/lib/kubelet/pods/dcbed267-4fda-461c-9924-0ca7d2a089b1/volumes" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.913445 4831 scope.go:117] "RemoveContainer" containerID="8786b7a534e3dcdb64b0343a830051d7fd070779f84d3e54f8d66df1324924bf" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.913520 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhzch" Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.916815 4831 generic.go:334] "Generic (PLEG): container finished" podID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerID="f8f67180f107fdd184124a3a55ac09c72f837e6d09c78a23209be1fc8626e1ef" exitCode=0 Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.916882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-276cz" event={"ID":"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b","Type":"ContainerDied","Data":"f8f67180f107fdd184124a3a55ac09c72f837e6d09c78a23209be1fc8626e1ef"} Mar 09 16:37:41 crc kubenswrapper[4831]: I0309 16:37:41.916927 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-276cz" event={"ID":"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b","Type":"ContainerStarted","Data":"0971fa96565883b05b2db58e071ba30fd6d40eae4e26047fc3e8cfb94c0f10c0"} Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.436046 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7"] Mar 09 16:37:42 crc kubenswrapper[4831]: E0309 16:37:42.436503 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcbed267-4fda-461c-9924-0ca7d2a089b1" 
containerName="swift-ring-rebalance" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.436521 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcbed267-4fda-461c-9924-0ca7d2a089b1" containerName="swift-ring-rebalance" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.436687 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcbed267-4fda-461c-9924-0ca7d2a089b1" containerName="swift-ring-rebalance" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.437485 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.439688 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.440964 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.443822 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7"] Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.521077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-ring-data-devices\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.521219 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-dispersionconf\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.521301 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-swiftconf\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.521327 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2358fd8-739e-4340-8401-29f7efbb3c58-etc-swift\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.521361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-scripts\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.521381 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnsxn\" (UniqueName: \"kubernetes.io/projected/a2358fd8-739e-4340-8401-29f7efbb3c58-kube-api-access-fnsxn\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.622159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-dispersionconf\") pod 
\"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.622216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-swiftconf\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.622244 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2358fd8-739e-4340-8401-29f7efbb3c58-etc-swift\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.622282 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-scripts\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.622303 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnsxn\" (UniqueName: \"kubernetes.io/projected/a2358fd8-739e-4340-8401-29f7efbb3c58-kube-api-access-fnsxn\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.622440 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-ring-data-devices\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.623039 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2358fd8-739e-4340-8401-29f7efbb3c58-etc-swift\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.623215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-scripts\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.623474 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-ring-data-devices\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.627046 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-dispersionconf\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.627225 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-swiftconf\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.650100 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnsxn\" (UniqueName: \"kubernetes.io/projected/a2358fd8-739e-4340-8401-29f7efbb3c58-kube-api-access-fnsxn\") pod \"swift-ring-rebalance-debug-4s7m7\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:42 crc kubenswrapper[4831]: I0309 16:37:42.760355 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:43 crc kubenswrapper[4831]: I0309 16:37:43.181254 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7"] Mar 09 16:37:43 crc kubenswrapper[4831]: W0309 16:37:43.249258 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2358fd8_739e_4340_8401_29f7efbb3c58.slice/crio-0937fb7d2b548dab8ee8fc4192a08e6fdde86decd36dddca66b7719754cd319f WatchSource:0}: Error finding container 0937fb7d2b548dab8ee8fc4192a08e6fdde86decd36dddca66b7719754cd319f: Status 404 returned error can't find the container with id 0937fb7d2b548dab8ee8fc4192a08e6fdde86decd36dddca66b7719754cd319f Mar 09 16:37:43 crc kubenswrapper[4831]: I0309 16:37:43.943887 4831 generic.go:334] "Generic (PLEG): container finished" podID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerID="f2dc2f314581ffb67d483c238987cc52b430aa1519cfe0d0f0e16406a2f3e6ba" exitCode=0 Mar 09 16:37:43 crc kubenswrapper[4831]: I0309 16:37:43.944009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-276cz" 
event={"ID":"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b","Type":"ContainerDied","Data":"f2dc2f314581ffb67d483c238987cc52b430aa1519cfe0d0f0e16406a2f3e6ba"} Mar 09 16:37:43 crc kubenswrapper[4831]: I0309 16:37:43.948045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" event={"ID":"a2358fd8-739e-4340-8401-29f7efbb3c58","Type":"ContainerStarted","Data":"679bfb8e7a0619002244f946d608b35a121c4200895b41b031857bf4361ba88a"} Mar 09 16:37:43 crc kubenswrapper[4831]: I0309 16:37:43.948088 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" event={"ID":"a2358fd8-739e-4340-8401-29f7efbb3c58","Type":"ContainerStarted","Data":"0937fb7d2b548dab8ee8fc4192a08e6fdde86decd36dddca66b7719754cd319f"} Mar 09 16:37:43 crc kubenswrapper[4831]: I0309 16:37:43.992993 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" podStartSLOduration=1.9929718699999999 podStartE2EDuration="1.99297187s" podCreationTimestamp="2026-03-09 16:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:43.990299354 +0000 UTC m=+2391.123981787" watchObservedRunningTime="2026-03-09 16:37:43.99297187 +0000 UTC m=+2391.126654293" Mar 09 16:37:46 crc kubenswrapper[4831]: I0309 16:37:46.097185 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-276cz" event={"ID":"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b","Type":"ContainerStarted","Data":"3f83a8a0e64182f67fa64f0da1491c6f8f29f58a32091b272e197c6d7c182324"} Mar 09 16:37:46 crc kubenswrapper[4831]: I0309 16:37:46.099649 4831 generic.go:334] "Generic (PLEG): container finished" podID="a2358fd8-739e-4340-8401-29f7efbb3c58" containerID="679bfb8e7a0619002244f946d608b35a121c4200895b41b031857bf4361ba88a" exitCode=0 Mar 09 16:37:46 crc 
kubenswrapper[4831]: I0309 16:37:46.099681 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" event={"ID":"a2358fd8-739e-4340-8401-29f7efbb3c58","Type":"ContainerDied","Data":"679bfb8e7a0619002244f946d608b35a121c4200895b41b031857bf4361ba88a"} Mar 09 16:37:46 crc kubenswrapper[4831]: I0309 16:37:46.118572 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-276cz" podStartSLOduration=2.968316355 podStartE2EDuration="6.118557432s" podCreationTimestamp="2026-03-09 16:37:40 +0000 UTC" firstStartedPulling="2026-03-09 16:37:41.919068143 +0000 UTC m=+2389.052750616" lastFinishedPulling="2026-03-09 16:37:45.06930927 +0000 UTC m=+2392.202991693" observedRunningTime="2026-03-09 16:37:46.116155234 +0000 UTC m=+2393.249837657" watchObservedRunningTime="2026-03-09 16:37:46.118557432 +0000 UTC m=+2393.252239855" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.434663 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.464892 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7"] Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.483631 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7"] Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.507549 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-scripts\") pod \"a2358fd8-739e-4340-8401-29f7efbb3c58\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.507623 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-dispersionconf\") pod \"a2358fd8-739e-4340-8401-29f7efbb3c58\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.507658 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnsxn\" (UniqueName: \"kubernetes.io/projected/a2358fd8-739e-4340-8401-29f7efbb3c58-kube-api-access-fnsxn\") pod \"a2358fd8-739e-4340-8401-29f7efbb3c58\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.507690 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-swiftconf\") pod \"a2358fd8-739e-4340-8401-29f7efbb3c58\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.507789 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/a2358fd8-739e-4340-8401-29f7efbb3c58-etc-swift\") pod \"a2358fd8-739e-4340-8401-29f7efbb3c58\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.507883 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-ring-data-devices\") pod \"a2358fd8-739e-4340-8401-29f7efbb3c58\" (UID: \"a2358fd8-739e-4340-8401-29f7efbb3c58\") " Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.508510 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a2358fd8-739e-4340-8401-29f7efbb3c58" (UID: "a2358fd8-739e-4340-8401-29f7efbb3c58"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.508554 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2358fd8-739e-4340-8401-29f7efbb3c58-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a2358fd8-739e-4340-8401-29f7efbb3c58" (UID: "a2358fd8-739e-4340-8401-29f7efbb3c58"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.512518 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2358fd8-739e-4340-8401-29f7efbb3c58-kube-api-access-fnsxn" (OuterVolumeSpecName: "kube-api-access-fnsxn") pod "a2358fd8-739e-4340-8401-29f7efbb3c58" (UID: "a2358fd8-739e-4340-8401-29f7efbb3c58"). InnerVolumeSpecName "kube-api-access-fnsxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.527131 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-scripts" (OuterVolumeSpecName: "scripts") pod "a2358fd8-739e-4340-8401-29f7efbb3c58" (UID: "a2358fd8-739e-4340-8401-29f7efbb3c58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.530115 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a2358fd8-739e-4340-8401-29f7efbb3c58" (UID: "a2358fd8-739e-4340-8401-29f7efbb3c58"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.533430 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a2358fd8-739e-4340-8401-29f7efbb3c58" (UID: "a2358fd8-739e-4340-8401-29f7efbb3c58"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.610754 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.610792 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2358fd8-739e-4340-8401-29f7efbb3c58-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.610804 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.610815 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2358fd8-739e-4340-8401-29f7efbb3c58-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.610825 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2358fd8-739e-4340-8401-29f7efbb3c58-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.610835 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnsxn\" (UniqueName: \"kubernetes.io/projected/a2358fd8-739e-4340-8401-29f7efbb3c58-kube-api-access-fnsxn\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:47 crc kubenswrapper[4831]: I0309 16:37:47.626417 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2358fd8-739e-4340-8401-29f7efbb3c58" path="/var/lib/kubelet/pods/a2358fd8-739e-4340-8401-29f7efbb3c58/volumes" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.119916 4831 scope.go:117] "RemoveContainer" 
containerID="679bfb8e7a0619002244f946d608b35a121c4200895b41b031857bf4361ba88a" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.119959 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4s7m7" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.622959 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh"] Mar 09 16:37:48 crc kubenswrapper[4831]: E0309 16:37:48.623663 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2358fd8-739e-4340-8401-29f7efbb3c58" containerName="swift-ring-rebalance" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.623683 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2358fd8-739e-4340-8401-29f7efbb3c58" containerName="swift-ring-rebalance" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.623935 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2358fd8-739e-4340-8401-29f7efbb3c58" containerName="swift-ring-rebalance" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.624544 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.626438 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.627025 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.638037 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh"] Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.726589 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-scripts\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.726827 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.726899 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-dispersionconf\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.727022 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-swiftconf\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.727081 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295w5\" (UniqueName: \"kubernetes.io/projected/551ac1e1-59d8-4fd8-9a70-87904aba656e-kube-api-access-295w5\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.727119 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/551ac1e1-59d8-4fd8-9a70-87904aba656e-etc-swift\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.829092 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/551ac1e1-59d8-4fd8-9a70-87904aba656e-etc-swift\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.829166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-scripts\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: 
I0309 16:37:48.829197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.829245 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-dispersionconf\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.829305 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-swiftconf\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.829353 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-295w5\" (UniqueName: \"kubernetes.io/projected/551ac1e1-59d8-4fd8-9a70-87904aba656e-kube-api-access-295w5\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.829882 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/551ac1e1-59d8-4fd8-9a70-87904aba656e-etc-swift\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc 
kubenswrapper[4831]: I0309 16:37:48.831646 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-scripts\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.831784 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.834162 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-dispersionconf\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.834616 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-swiftconf\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: I0309 16:37:48.854005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-295w5\" (UniqueName: \"kubernetes.io/projected/551ac1e1-59d8-4fd8-9a70-87904aba656e-kube-api-access-295w5\") pod \"swift-ring-rebalance-debug-xjrrh\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:48 crc kubenswrapper[4831]: 
I0309 16:37:48.944515 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:49 crc kubenswrapper[4831]: I0309 16:37:49.342339 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh"] Mar 09 16:37:49 crc kubenswrapper[4831]: W0309 16:37:49.351224 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod551ac1e1_59d8_4fd8_9a70_87904aba656e.slice/crio-20f1ef6ff3aaaedb261e35c8518975e8b29084d027ce30258d0794564d5c0776 WatchSource:0}: Error finding container 20f1ef6ff3aaaedb261e35c8518975e8b29084d027ce30258d0794564d5c0776: Status 404 returned error can't find the container with id 20f1ef6ff3aaaedb261e35c8518975e8b29084d027ce30258d0794564d5c0776 Mar 09 16:37:50 crc kubenswrapper[4831]: I0309 16:37:50.139985 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" event={"ID":"551ac1e1-59d8-4fd8-9a70-87904aba656e","Type":"ContainerStarted","Data":"d2d58cdb4fc9a6d3600540da71b318ce81e478473c51ecdcd7b9c39a67d624ce"} Mar 09 16:37:50 crc kubenswrapper[4831]: I0309 16:37:50.140330 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" event={"ID":"551ac1e1-59d8-4fd8-9a70-87904aba656e","Type":"ContainerStarted","Data":"20f1ef6ff3aaaedb261e35c8518975e8b29084d027ce30258d0794564d5c0776"} Mar 09 16:37:50 crc kubenswrapper[4831]: I0309 16:37:50.166258 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" podStartSLOduration=2.166234844 podStartE2EDuration="2.166234844s" podCreationTimestamp="2026-03-09 16:37:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:50.161008466 +0000 
UTC m=+2397.294690899" watchObservedRunningTime="2026-03-09 16:37:50.166234844 +0000 UTC m=+2397.299917267" Mar 09 16:37:50 crc kubenswrapper[4831]: I0309 16:37:50.657113 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:50 crc kubenswrapper[4831]: I0309 16:37:50.657453 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:50 crc kubenswrapper[4831]: I0309 16:37:50.720744 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:51 crc kubenswrapper[4831]: I0309 16:37:51.155432 4831 generic.go:334] "Generic (PLEG): container finished" podID="551ac1e1-59d8-4fd8-9a70-87904aba656e" containerID="d2d58cdb4fc9a6d3600540da71b318ce81e478473c51ecdcd7b9c39a67d624ce" exitCode=0 Mar 09 16:37:51 crc kubenswrapper[4831]: I0309 16:37:51.155581 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" event={"ID":"551ac1e1-59d8-4fd8-9a70-87904aba656e","Type":"ContainerDied","Data":"d2d58cdb4fc9a6d3600540da71b318ce81e478473c51ecdcd7b9c39a67d624ce"} Mar 09 16:37:51 crc kubenswrapper[4831]: I0309 16:37:51.213496 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:51 crc kubenswrapper[4831]: I0309 16:37:51.618002 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:37:51 crc kubenswrapper[4831]: E0309 16:37:51.618285 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.447585 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.480864 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh"] Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.485981 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh"] Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.585653 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-dispersionconf\") pod \"551ac1e1-59d8-4fd8-9a70-87904aba656e\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.585767 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/551ac1e1-59d8-4fd8-9a70-87904aba656e-etc-swift\") pod \"551ac1e1-59d8-4fd8-9a70-87904aba656e\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.585811 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-swiftconf\") pod \"551ac1e1-59d8-4fd8-9a70-87904aba656e\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.585854 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-scripts\") pod \"551ac1e1-59d8-4fd8-9a70-87904aba656e\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.585932 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-295w5\" (UniqueName: \"kubernetes.io/projected/551ac1e1-59d8-4fd8-9a70-87904aba656e-kube-api-access-295w5\") pod \"551ac1e1-59d8-4fd8-9a70-87904aba656e\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.585966 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-ring-data-devices\") pod \"551ac1e1-59d8-4fd8-9a70-87904aba656e\" (UID: \"551ac1e1-59d8-4fd8-9a70-87904aba656e\") " Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.586782 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "551ac1e1-59d8-4fd8-9a70-87904aba656e" (UID: "551ac1e1-59d8-4fd8-9a70-87904aba656e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.586874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551ac1e1-59d8-4fd8-9a70-87904aba656e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "551ac1e1-59d8-4fd8-9a70-87904aba656e" (UID: "551ac1e1-59d8-4fd8-9a70-87904aba656e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.591233 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551ac1e1-59d8-4fd8-9a70-87904aba656e-kube-api-access-295w5" (OuterVolumeSpecName: "kube-api-access-295w5") pod "551ac1e1-59d8-4fd8-9a70-87904aba656e" (UID: "551ac1e1-59d8-4fd8-9a70-87904aba656e"). InnerVolumeSpecName "kube-api-access-295w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.604584 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-scripts" (OuterVolumeSpecName: "scripts") pod "551ac1e1-59d8-4fd8-9a70-87904aba656e" (UID: "551ac1e1-59d8-4fd8-9a70-87904aba656e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.608510 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "551ac1e1-59d8-4fd8-9a70-87904aba656e" (UID: "551ac1e1-59d8-4fd8-9a70-87904aba656e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.626435 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "551ac1e1-59d8-4fd8-9a70-87904aba656e" (UID: "551ac1e1-59d8-4fd8-9a70-87904aba656e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.690271 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.690966 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-295w5\" (UniqueName: \"kubernetes.io/projected/551ac1e1-59d8-4fd8-9a70-87904aba656e-kube-api-access-295w5\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.691228 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/551ac1e1-59d8-4fd8-9a70-87904aba656e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.691261 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.691274 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/551ac1e1-59d8-4fd8-9a70-87904aba656e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:52 crc kubenswrapper[4831]: I0309 16:37:52.691433 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/551ac1e1-59d8-4fd8-9a70-87904aba656e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.181560 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20f1ef6ff3aaaedb261e35c8518975e8b29084d027ce30258d0794564d5c0776" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.181592 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xjrrh" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.647124 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551ac1e1-59d8-4fd8-9a70-87904aba656e" path="/var/lib/kubelet/pods/551ac1e1-59d8-4fd8-9a70-87904aba656e/volumes" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.648591 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4"] Mar 09 16:37:53 crc kubenswrapper[4831]: E0309 16:37:53.649988 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551ac1e1-59d8-4fd8-9a70-87904aba656e" containerName="swift-ring-rebalance" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.650007 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="551ac1e1-59d8-4fd8-9a70-87904aba656e" containerName="swift-ring-rebalance" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.650171 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="551ac1e1-59d8-4fd8-9a70-87904aba656e" containerName="swift-ring-rebalance" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.650777 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.651015 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4"] Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.653106 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.653374 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.704285 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-swiftconf\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.704442 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-dispersionconf\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.704546 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3f5a536-4066-4089-87c0-12b7caa0f29c-etc-swift\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.704638 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-ring-data-devices\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.704683 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-scripts\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.704783 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2cgn\" (UniqueName: \"kubernetes.io/projected/f3f5a536-4066-4089-87c0-12b7caa0f29c-kube-api-access-h2cgn\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.806893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-swiftconf\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.807008 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-dispersionconf\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.807073 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3f5a536-4066-4089-87c0-12b7caa0f29c-etc-swift\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.807157 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-ring-data-devices\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.807187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-scripts\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.807296 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2cgn\" (UniqueName: \"kubernetes.io/projected/f3f5a536-4066-4089-87c0-12b7caa0f29c-kube-api-access-h2cgn\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.808075 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3f5a536-4066-4089-87c0-12b7caa0f29c-etc-swift\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.808509 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-ring-data-devices\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.808598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-scripts\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.814211 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-dispersionconf\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.814736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-swiftconf\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.826584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2cgn\" (UniqueName: \"kubernetes.io/projected/f3f5a536-4066-4089-87c0-12b7caa0f29c-kube-api-access-h2cgn\") pod \"swift-ring-rebalance-debug-wn8t4\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:53 crc kubenswrapper[4831]: I0309 16:37:53.989071 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:54 crc kubenswrapper[4831]: I0309 16:37:54.318214 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-276cz"] Mar 09 16:37:54 crc kubenswrapper[4831]: I0309 16:37:54.318739 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-276cz" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="registry-server" containerID="cri-o://3f83a8a0e64182f67fa64f0da1491c6f8f29f58a32091b272e197c6d7c182324" gracePeriod=2 Mar 09 16:37:54 crc kubenswrapper[4831]: W0309 16:37:54.455084 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3f5a536_4066_4089_87c0_12b7caa0f29c.slice/crio-2700956fbc444d2aa268e3312d0d6603bc958ad39cba2a1ff5ae6141d0302867 WatchSource:0}: Error finding container 2700956fbc444d2aa268e3312d0d6603bc958ad39cba2a1ff5ae6141d0302867: Status 404 returned error can't find the container with id 2700956fbc444d2aa268e3312d0d6603bc958ad39cba2a1ff5ae6141d0302867 Mar 09 16:37:54 crc kubenswrapper[4831]: I0309 16:37:54.458537 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4"] Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.215298 4831 generic.go:334] "Generic (PLEG): container finished" podID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerID="3f83a8a0e64182f67fa64f0da1491c6f8f29f58a32091b272e197c6d7c182324" exitCode=0 Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.215384 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-276cz" event={"ID":"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b","Type":"ContainerDied","Data":"3f83a8a0e64182f67fa64f0da1491c6f8f29f58a32091b272e197c6d7c182324"} Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 
16:37:55.217857 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" event={"ID":"f3f5a536-4066-4089-87c0-12b7caa0f29c","Type":"ContainerStarted","Data":"1b0cd6a0c4db6b08148beec7c654ac778701f426f22dcbcd32faf388d7808e62"} Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.217878 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" event={"ID":"f3f5a536-4066-4089-87c0-12b7caa0f29c","Type":"ContainerStarted","Data":"2700956fbc444d2aa268e3312d0d6603bc958ad39cba2a1ff5ae6141d0302867"} Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.242245 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" podStartSLOduration=2.242227653 podStartE2EDuration="2.242227653s" podCreationTimestamp="2026-03-09 16:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:37:55.238183529 +0000 UTC m=+2402.371865952" watchObservedRunningTime="2026-03-09 16:37:55.242227653 +0000 UTC m=+2402.375910086" Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.279901 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.340630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-utilities\") pod \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.340672 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-catalog-content\") pod \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.340713 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmpfs\" (UniqueName: \"kubernetes.io/projected/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-kube-api-access-kmpfs\") pod \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\" (UID: \"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b\") " Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.341374 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-utilities" (OuterVolumeSpecName: "utilities") pod "f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" (UID: "f9fb4ed2-406c-45c5-b566-a4a26df9fd2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.344965 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-kube-api-access-kmpfs" (OuterVolumeSpecName: "kube-api-access-kmpfs") pod "f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" (UID: "f9fb4ed2-406c-45c5-b566-a4a26df9fd2b"). InnerVolumeSpecName "kube-api-access-kmpfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.396205 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" (UID: "f9fb4ed2-406c-45c5-b566-a4a26df9fd2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.442852 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.442885 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:55 crc kubenswrapper[4831]: I0309 16:37:55.442906 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmpfs\" (UniqueName: \"kubernetes.io/projected/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b-kube-api-access-kmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.232382 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-276cz" event={"ID":"f9fb4ed2-406c-45c5-b566-a4a26df9fd2b","Type":"ContainerDied","Data":"0971fa96565883b05b2db58e071ba30fd6d40eae4e26047fc3e8cfb94c0f10c0"} Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.232708 4831 scope.go:117] "RemoveContainer" containerID="3f83a8a0e64182f67fa64f0da1491c6f8f29f58a32091b272e197c6d7c182324" Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.232438 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-276cz" Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.235231 4831 generic.go:334] "Generic (PLEG): container finished" podID="f3f5a536-4066-4089-87c0-12b7caa0f29c" containerID="1b0cd6a0c4db6b08148beec7c654ac778701f426f22dcbcd32faf388d7808e62" exitCode=0 Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.235306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" event={"ID":"f3f5a536-4066-4089-87c0-12b7caa0f29c","Type":"ContainerDied","Data":"1b0cd6a0c4db6b08148beec7c654ac778701f426f22dcbcd32faf388d7808e62"} Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.265888 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-276cz"] Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.269663 4831 scope.go:117] "RemoveContainer" containerID="f2dc2f314581ffb67d483c238987cc52b430aa1519cfe0d0f0e16406a2f3e6ba" Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.276779 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-276cz"] Mar 09 16:37:56 crc kubenswrapper[4831]: I0309 16:37:56.298748 4831 scope.go:117] "RemoveContainer" containerID="f8f67180f107fdd184124a3a55ac09c72f837e6d09c78a23209be1fc8626e1ef" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.492883 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.539933 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4"] Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.548853 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4"] Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.577224 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-swiftconf\") pod \"f3f5a536-4066-4089-87c0-12b7caa0f29c\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.577307 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-dispersionconf\") pod \"f3f5a536-4066-4089-87c0-12b7caa0f29c\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.577330 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-scripts\") pod \"f3f5a536-4066-4089-87c0-12b7caa0f29c\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.577373 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2cgn\" (UniqueName: \"kubernetes.io/projected/f3f5a536-4066-4089-87c0-12b7caa0f29c-kube-api-access-h2cgn\") pod \"f3f5a536-4066-4089-87c0-12b7caa0f29c\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.577414 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/f3f5a536-4066-4089-87c0-12b7caa0f29c-etc-swift\") pod \"f3f5a536-4066-4089-87c0-12b7caa0f29c\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.577503 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-ring-data-devices\") pod \"f3f5a536-4066-4089-87c0-12b7caa0f29c\" (UID: \"f3f5a536-4066-4089-87c0-12b7caa0f29c\") " Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.578445 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f3f5a536-4066-4089-87c0-12b7caa0f29c" (UID: "f3f5a536-4066-4089-87c0-12b7caa0f29c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.578718 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3f5a536-4066-4089-87c0-12b7caa0f29c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f3f5a536-4066-4089-87c0-12b7caa0f29c" (UID: "f3f5a536-4066-4089-87c0-12b7caa0f29c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.585899 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3f5a536-4066-4089-87c0-12b7caa0f29c-kube-api-access-h2cgn" (OuterVolumeSpecName: "kube-api-access-h2cgn") pod "f3f5a536-4066-4089-87c0-12b7caa0f29c" (UID: "f3f5a536-4066-4089-87c0-12b7caa0f29c"). InnerVolumeSpecName "kube-api-access-h2cgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.599484 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f3f5a536-4066-4089-87c0-12b7caa0f29c" (UID: "f3f5a536-4066-4089-87c0-12b7caa0f29c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.603372 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-scripts" (OuterVolumeSpecName: "scripts") pod "f3f5a536-4066-4089-87c0-12b7caa0f29c" (UID: "f3f5a536-4066-4089-87c0-12b7caa0f29c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.603476 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f3f5a536-4066-4089-87c0-12b7caa0f29c" (UID: "f3f5a536-4066-4089-87c0-12b7caa0f29c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.627501 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3f5a536-4066-4089-87c0-12b7caa0f29c" path="/var/lib/kubelet/pods/f3f5a536-4066-4089-87c0-12b7caa0f29c/volumes" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.628019 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" path="/var/lib/kubelet/pods/f9fb4ed2-406c-45c5-b566-a4a26df9fd2b/volumes" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.679691 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.679718 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.679727 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3f5a536-4066-4089-87c0-12b7caa0f29c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.679737 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3f5a536-4066-4089-87c0-12b7caa0f29c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.679763 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2cgn\" (UniqueName: \"kubernetes.io/projected/f3f5a536-4066-4089-87c0-12b7caa0f29c-kube-api-access-h2cgn\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:57 crc kubenswrapper[4831]: I0309 16:37:57.679772 4831 reconciler_common.go:293] "Volume detached for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3f5a536-4066-4089-87c0-12b7caa0f29c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.258827 4831 scope.go:117] "RemoveContainer" containerID="1b0cd6a0c4db6b08148beec7c654ac778701f426f22dcbcd32faf388d7808e62" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.258878 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wn8t4" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.659596 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c57c5"] Mar 09 16:37:58 crc kubenswrapper[4831]: E0309 16:37:58.659929 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="registry-server" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.659944 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="registry-server" Mar 09 16:37:58 crc kubenswrapper[4831]: E0309 16:37:58.659957 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="extract-content" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.659965 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="extract-content" Mar 09 16:37:58 crc kubenswrapper[4831]: E0309 16:37:58.659983 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3f5a536-4066-4089-87c0-12b7caa0f29c" containerName="swift-ring-rebalance" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.659992 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f5a536-4066-4089-87c0-12b7caa0f29c" containerName="swift-ring-rebalance" Mar 09 16:37:58 crc kubenswrapper[4831]: E0309 16:37:58.660018 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="extract-utilities" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.660026 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="extract-utilities" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.660190 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3f5a536-4066-4089-87c0-12b7caa0f29c" containerName="swift-ring-rebalance" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.660224 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9fb4ed2-406c-45c5-b566-a4a26df9fd2b" containerName="registry-server" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.660909 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.663757 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.664078 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.668462 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c57c5"] Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.696736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj57g\" (UniqueName: \"kubernetes.io/projected/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-kube-api-access-cj57g\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.696787 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-dispersionconf\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.696865 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-scripts\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.696947 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.696971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-etc-swift\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.696990 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-swiftconf\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 
16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.798256 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-scripts\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.798364 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.798457 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-etc-swift\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.798484 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-swiftconf\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.798530 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj57g\" (UniqueName: \"kubernetes.io/projected/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-kube-api-access-cj57g\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.798553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-dispersionconf\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.799081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-etc-swift\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.799503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-scripts\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.799513 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.802718 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-swiftconf\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.802793 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-dispersionconf\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.820115 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj57g\" (UniqueName: \"kubernetes.io/projected/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-kube-api-access-cj57g\") pod \"swift-ring-rebalance-debug-c57c5\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:58 crc kubenswrapper[4831]: I0309 16:37:58.977695 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:37:59 crc kubenswrapper[4831]: I0309 16:37:59.409106 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c57c5"] Mar 09 16:37:59 crc kubenswrapper[4831]: W0309 16:37:59.417348 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dd446e0_6bdf_4b86_a9be_d034cf5fdc3a.slice/crio-9b2aa32e16042e72b089969d0d81252224d7c567810e82fd10cdbb64072aef4e WatchSource:0}: Error finding container 9b2aa32e16042e72b089969d0d81252224d7c567810e82fd10cdbb64072aef4e: Status 404 returned error can't find the container with id 9b2aa32e16042e72b089969d0d81252224d7c567810e82fd10cdbb64072aef4e Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.161680 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551238-8jjbk"] Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 
16:38:00.163340 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.168154 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551238-8jjbk"] Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.173924 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.174258 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.174752 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.220974 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkmw\" (UniqueName: \"kubernetes.io/projected/f27a2831-75a5-4e0b-91f2-4320c114928b-kube-api-access-cfkmw\") pod \"auto-csr-approver-29551238-8jjbk\" (UID: \"f27a2831-75a5-4e0b-91f2-4320c114928b\") " pod="openshift-infra/auto-csr-approver-29551238-8jjbk" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.281926 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" event={"ID":"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a","Type":"ContainerStarted","Data":"c47111938c0344d6411cad0b21e657b51cc3160f9b044e63632db58785d9dd44"} Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.281971 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" event={"ID":"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a","Type":"ContainerStarted","Data":"9b2aa32e16042e72b089969d0d81252224d7c567810e82fd10cdbb64072aef4e"} Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 
16:38:00.300925 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" podStartSLOduration=2.300908893 podStartE2EDuration="2.300908893s" podCreationTimestamp="2026-03-09 16:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:00.2951957 +0000 UTC m=+2407.428878123" watchObservedRunningTime="2026-03-09 16:38:00.300908893 +0000 UTC m=+2407.434591336" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.322551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkmw\" (UniqueName: \"kubernetes.io/projected/f27a2831-75a5-4e0b-91f2-4320c114928b-kube-api-access-cfkmw\") pod \"auto-csr-approver-29551238-8jjbk\" (UID: \"f27a2831-75a5-4e0b-91f2-4320c114928b\") " pod="openshift-infra/auto-csr-approver-29551238-8jjbk" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.345107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkmw\" (UniqueName: \"kubernetes.io/projected/f27a2831-75a5-4e0b-91f2-4320c114928b-kube-api-access-cfkmw\") pod \"auto-csr-approver-29551238-8jjbk\" (UID: \"f27a2831-75a5-4e0b-91f2-4320c114928b\") " pod="openshift-infra/auto-csr-approver-29551238-8jjbk" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.487148 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" Mar 09 16:38:00 crc kubenswrapper[4831]: I0309 16:38:00.941484 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551238-8jjbk"] Mar 09 16:38:00 crc kubenswrapper[4831]: W0309 16:38:00.947128 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf27a2831_75a5_4e0b_91f2_4320c114928b.slice/crio-e195cc0bc05146570457dba6e052a488bd948126ad3963a9ce730d1fa1ffaac6 WatchSource:0}: Error finding container e195cc0bc05146570457dba6e052a488bd948126ad3963a9ce730d1fa1ffaac6: Status 404 returned error can't find the container with id e195cc0bc05146570457dba6e052a488bd948126ad3963a9ce730d1fa1ffaac6 Mar 09 16:38:01 crc kubenswrapper[4831]: I0309 16:38:01.289756 4831 generic.go:334] "Generic (PLEG): container finished" podID="3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" containerID="c47111938c0344d6411cad0b21e657b51cc3160f9b044e63632db58785d9dd44" exitCode=0 Mar 09 16:38:01 crc kubenswrapper[4831]: I0309 16:38:01.289814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" event={"ID":"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a","Type":"ContainerDied","Data":"c47111938c0344d6411cad0b21e657b51cc3160f9b044e63632db58785d9dd44"} Mar 09 16:38:01 crc kubenswrapper[4831]: I0309 16:38:01.292490 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" event={"ID":"f27a2831-75a5-4e0b-91f2-4320c114928b","Type":"ContainerStarted","Data":"e195cc0bc05146570457dba6e052a488bd948126ad3963a9ce730d1fa1ffaac6"} Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.302972 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" 
event={"ID":"f27a2831-75a5-4e0b-91f2-4320c114928b","Type":"ContainerStarted","Data":"fe3f7da4548420787610f1f78236bd25a4553484f327e6ef5c0478073959081b"} Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.325185 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" podStartSLOduration=1.279849119 podStartE2EDuration="2.3251677s" podCreationTimestamp="2026-03-09 16:38:00 +0000 UTC" firstStartedPulling="2026-03-09 16:38:00.950384761 +0000 UTC m=+2408.084067184" lastFinishedPulling="2026-03-09 16:38:01.995703322 +0000 UTC m=+2409.129385765" observedRunningTime="2026-03-09 16:38:02.31810551 +0000 UTC m=+2409.451787953" watchObservedRunningTime="2026-03-09 16:38:02.3251677 +0000 UTC m=+2409.458850123" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.588490 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.622871 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c57c5"] Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.627768 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c57c5"] Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.761915 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-etc-swift\") pod \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.762148 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-ring-data-devices\") pod \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\" (UID: 
\"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.762282 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-swiftconf\") pod \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.762425 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-dispersionconf\") pod \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.762564 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-scripts\") pod \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.762670 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj57g\" (UniqueName: \"kubernetes.io/projected/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-kube-api-access-cj57g\") pod \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\" (UID: \"3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a\") " Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.762666 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" (UID: "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.762797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" (UID: "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.763261 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.763340 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.767863 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-kube-api-access-cj57g" (OuterVolumeSpecName: "kube-api-access-cj57g") pod "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" (UID: "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a"). InnerVolumeSpecName "kube-api-access-cj57g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.782067 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-scripts" (OuterVolumeSpecName: "scripts") pod "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" (UID: "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.783190 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" (UID: "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.783624 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" (UID: "3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.864774 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.864821 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.864837 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:02 crc kubenswrapper[4831]: I0309 16:38:02.864850 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj57g\" (UniqueName: \"kubernetes.io/projected/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a-kube-api-access-cj57g\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:03 crc 
kubenswrapper[4831]: I0309 16:38:03.236176 4831 scope.go:117] "RemoveContainer" containerID="00e1ece29cdda227161271a4e4c5992cc396e33ea6aeda03b68ad6615faf2941" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.319803 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2aa32e16042e72b089969d0d81252224d7c567810e82fd10cdbb64072aef4e" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.320634 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c57c5" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.324169 4831 generic.go:334] "Generic (PLEG): container finished" podID="f27a2831-75a5-4e0b-91f2-4320c114928b" containerID="fe3f7da4548420787610f1f78236bd25a4553484f327e6ef5c0478073959081b" exitCode=0 Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.324250 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" event={"ID":"f27a2831-75a5-4e0b-91f2-4320c114928b","Type":"ContainerDied","Data":"fe3f7da4548420787610f1f78236bd25a4553484f327e6ef5c0478073959081b"} Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.628593 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" path="/var/lib/kubelet/pods/3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a/volumes" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.758042 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rx55f"] Mar 09 16:38:03 crc kubenswrapper[4831]: E0309 16:38:03.758316 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" containerName="swift-ring-rebalance" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.758332 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" containerName="swift-ring-rebalance" Mar 09 
16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.758491 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd446e0-6bdf-4b86-a9be-d034cf5fdc3a" containerName="swift-ring-rebalance" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.758969 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.761136 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.761994 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.766534 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rx55f"] Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.776935 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxjkh\" (UniqueName: \"kubernetes.io/projected/a3d53224-cd7a-4eb5-893c-4e764335918b-kube-api-access-kxjkh\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.777321 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-swiftconf\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.777521 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-ring-data-devices\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.777604 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3d53224-cd7a-4eb5-893c-4e764335918b-etc-swift\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.777739 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-dispersionconf\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.777825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-scripts\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.878968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxjkh\" (UniqueName: \"kubernetes.io/projected/a3d53224-cd7a-4eb5-893c-4e764335918b-kube-api-access-kxjkh\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.879029 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-swiftconf\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.879061 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-ring-data-devices\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.879083 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3d53224-cd7a-4eb5-893c-4e764335918b-etc-swift\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.879118 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-dispersionconf\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.879140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-scripts\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.879581 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3d53224-cd7a-4eb5-893c-4e764335918b-etc-swift\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.879986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-scripts\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.880004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-ring-data-devices\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.884322 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-dispersionconf\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.884560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-swiftconf\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:03 crc kubenswrapper[4831]: I0309 16:38:03.900080 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kxjkh\" (UniqueName: \"kubernetes.io/projected/a3d53224-cd7a-4eb5-893c-4e764335918b-kube-api-access-kxjkh\") pod \"swift-ring-rebalance-debug-rx55f\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:04 crc kubenswrapper[4831]: I0309 16:38:04.084093 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:04 crc kubenswrapper[4831]: I0309 16:38:04.500684 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rx55f"] Mar 09 16:38:04 crc kubenswrapper[4831]: W0309 16:38:04.504966 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3d53224_cd7a_4eb5_893c_4e764335918b.slice/crio-2d65dcfb5fab0d6186effe468475dcf57710b16624f2924abed56c8483e1020f WatchSource:0}: Error finding container 2d65dcfb5fab0d6186effe468475dcf57710b16624f2924abed56c8483e1020f: Status 404 returned error can't find the container with id 2d65dcfb5fab0d6186effe468475dcf57710b16624f2924abed56c8483e1020f Mar 09 16:38:04 crc kubenswrapper[4831]: I0309 16:38:04.616870 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" Mar 09 16:38:04 crc kubenswrapper[4831]: I0309 16:38:04.616970 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:38:04 crc kubenswrapper[4831]: E0309 16:38:04.617178 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:38:04 crc kubenswrapper[4831]: I0309 16:38:04.694594 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfkmw\" (UniqueName: \"kubernetes.io/projected/f27a2831-75a5-4e0b-91f2-4320c114928b-kube-api-access-cfkmw\") pod \"f27a2831-75a5-4e0b-91f2-4320c114928b\" (UID: \"f27a2831-75a5-4e0b-91f2-4320c114928b\") " Mar 09 16:38:04 crc kubenswrapper[4831]: I0309 16:38:04.700544 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27a2831-75a5-4e0b-91f2-4320c114928b-kube-api-access-cfkmw" (OuterVolumeSpecName: "kube-api-access-cfkmw") pod "f27a2831-75a5-4e0b-91f2-4320c114928b" (UID: "f27a2831-75a5-4e0b-91f2-4320c114928b"). InnerVolumeSpecName "kube-api-access-cfkmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:04 crc kubenswrapper[4831]: I0309 16:38:04.796093 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfkmw\" (UniqueName: \"kubernetes.io/projected/f27a2831-75a5-4e0b-91f2-4320c114928b-kube-api-access-cfkmw\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.348066 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" event={"ID":"a3d53224-cd7a-4eb5-893c-4e764335918b","Type":"ContainerStarted","Data":"fc3fabf66216882a5aafded15bed68839b0b511c431c04eaf20cf8c41871b256"} Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.348351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" event={"ID":"a3d53224-cd7a-4eb5-893c-4e764335918b","Type":"ContainerStarted","Data":"2d65dcfb5fab0d6186effe468475dcf57710b16624f2924abed56c8483e1020f"} Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.354540 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" event={"ID":"f27a2831-75a5-4e0b-91f2-4320c114928b","Type":"ContainerDied","Data":"e195cc0bc05146570457dba6e052a488bd948126ad3963a9ce730d1fa1ffaac6"} Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.354605 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e195cc0bc05146570457dba6e052a488bd948126ad3963a9ce730d1fa1ffaac6" Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.354670 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551238-8jjbk" Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.375072 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" podStartSLOduration=2.37505217 podStartE2EDuration="2.37505217s" podCreationTimestamp="2026-03-09 16:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:05.372099407 +0000 UTC m=+2412.505781830" watchObservedRunningTime="2026-03-09 16:38:05.37505217 +0000 UTC m=+2412.508734593" Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.402266 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551232-d4cc9"] Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.410530 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551232-d4cc9"] Mar 09 16:38:05 crc kubenswrapper[4831]: I0309 16:38:05.624950 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd99a0c8-1885-451a-adfd-9b493fbf5e7f" path="/var/lib/kubelet/pods/cd99a0c8-1885-451a-adfd-9b493fbf5e7f/volumes" Mar 09 16:38:06 crc kubenswrapper[4831]: I0309 16:38:06.363478 4831 generic.go:334] "Generic (PLEG): container finished" podID="a3d53224-cd7a-4eb5-893c-4e764335918b" containerID="fc3fabf66216882a5aafded15bed68839b0b511c431c04eaf20cf8c41871b256" exitCode=0 Mar 09 16:38:06 crc kubenswrapper[4831]: I0309 16:38:06.363756 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" event={"ID":"a3d53224-cd7a-4eb5-893c-4e764335918b","Type":"ContainerDied","Data":"fc3fabf66216882a5aafded15bed68839b0b511c431c04eaf20cf8c41871b256"} Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.646241 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.690993 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rx55f"] Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.696412 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rx55f"] Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.834752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-swiftconf\") pod \"a3d53224-cd7a-4eb5-893c-4e764335918b\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.834893 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-ring-data-devices\") pod \"a3d53224-cd7a-4eb5-893c-4e764335918b\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.834946 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3d53224-cd7a-4eb5-893c-4e764335918b-etc-swift\") pod \"a3d53224-cd7a-4eb5-893c-4e764335918b\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.835026 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-dispersionconf\") pod \"a3d53224-cd7a-4eb5-893c-4e764335918b\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.835080 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kxjkh\" (UniqueName: \"kubernetes.io/projected/a3d53224-cd7a-4eb5-893c-4e764335918b-kube-api-access-kxjkh\") pod \"a3d53224-cd7a-4eb5-893c-4e764335918b\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.835118 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-scripts\") pod \"a3d53224-cd7a-4eb5-893c-4e764335918b\" (UID: \"a3d53224-cd7a-4eb5-893c-4e764335918b\") " Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.836098 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3d53224-cd7a-4eb5-893c-4e764335918b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a3d53224-cd7a-4eb5-893c-4e764335918b" (UID: "a3d53224-cd7a-4eb5-893c-4e764335918b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.836202 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a3d53224-cd7a-4eb5-893c-4e764335918b" (UID: "a3d53224-cd7a-4eb5-893c-4e764335918b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.836502 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.836521 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3d53224-cd7a-4eb5-893c-4e764335918b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.842605 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d53224-cd7a-4eb5-893c-4e764335918b-kube-api-access-kxjkh" (OuterVolumeSpecName: "kube-api-access-kxjkh") pod "a3d53224-cd7a-4eb5-893c-4e764335918b" (UID: "a3d53224-cd7a-4eb5-893c-4e764335918b"). InnerVolumeSpecName "kube-api-access-kxjkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.864209 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-scripts" (OuterVolumeSpecName: "scripts") pod "a3d53224-cd7a-4eb5-893c-4e764335918b" (UID: "a3d53224-cd7a-4eb5-893c-4e764335918b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.868536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a3d53224-cd7a-4eb5-893c-4e764335918b" (UID: "a3d53224-cd7a-4eb5-893c-4e764335918b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.886452 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a3d53224-cd7a-4eb5-893c-4e764335918b" (UID: "a3d53224-cd7a-4eb5-893c-4e764335918b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.937562 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxjkh\" (UniqueName: \"kubernetes.io/projected/a3d53224-cd7a-4eb5-893c-4e764335918b-kube-api-access-kxjkh\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.937667 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d53224-cd7a-4eb5-893c-4e764335918b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.937686 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:07 crc kubenswrapper[4831]: I0309 16:38:07.937702 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3d53224-cd7a-4eb5-893c-4e764335918b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.385450 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d65dcfb5fab0d6186effe468475dcf57710b16624f2924abed56c8483e1020f" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.385471 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rx55f" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.828952 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4"] Mar 09 16:38:08 crc kubenswrapper[4831]: E0309 16:38:08.829621 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27a2831-75a5-4e0b-91f2-4320c114928b" containerName="oc" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.829635 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27a2831-75a5-4e0b-91f2-4320c114928b" containerName="oc" Mar 09 16:38:08 crc kubenswrapper[4831]: E0309 16:38:08.829647 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d53224-cd7a-4eb5-893c-4e764335918b" containerName="swift-ring-rebalance" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.829653 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d53224-cd7a-4eb5-893c-4e764335918b" containerName="swift-ring-rebalance" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.829816 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d53224-cd7a-4eb5-893c-4e764335918b" containerName="swift-ring-rebalance" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.829832 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27a2831-75a5-4e0b-91f2-4320c114928b" containerName="oc" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.830517 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.835022 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.835419 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.845468 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4"] Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.849794 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-swiftconf\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.849939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-ring-data-devices\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.849985 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-etc-swift\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.850023 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-dispersionconf\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.850041 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974cn\" (UniqueName: \"kubernetes.io/projected/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-kube-api-access-974cn\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.850064 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-scripts\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.951162 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-swiftconf\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.951238 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-ring-data-devices\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc 
kubenswrapper[4831]: I0309 16:38:08.951263 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-etc-swift\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.952180 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-dispersionconf\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.952216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974cn\" (UniqueName: \"kubernetes.io/projected/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-kube-api-access-974cn\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.952245 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-scripts\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.952503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-etc-swift\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc 
kubenswrapper[4831]: I0309 16:38:08.952502 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-ring-data-devices\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.953180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-scripts\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.956659 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-dispersionconf\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.965895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-swiftconf\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:08 crc kubenswrapper[4831]: I0309 16:38:08.967348 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974cn\" (UniqueName: \"kubernetes.io/projected/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-kube-api-access-974cn\") pod \"swift-ring-rebalance-debug-9wcq4\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:09 crc kubenswrapper[4831]: 
I0309 16:38:09.153775 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:09 crc kubenswrapper[4831]: I0309 16:38:09.602858 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4"] Mar 09 16:38:09 crc kubenswrapper[4831]: W0309 16:38:09.610676 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1373fb8_b81b_422e_9afb_55f21b1b0ab2.slice/crio-6d284b9d0a74f5c69274b0e53872e6db3581ba3fe321c14a13c6234a28531a73 WatchSource:0}: Error finding container 6d284b9d0a74f5c69274b0e53872e6db3581ba3fe321c14a13c6234a28531a73: Status 404 returned error can't find the container with id 6d284b9d0a74f5c69274b0e53872e6db3581ba3fe321c14a13c6234a28531a73 Mar 09 16:38:09 crc kubenswrapper[4831]: I0309 16:38:09.626880 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d53224-cd7a-4eb5-893c-4e764335918b" path="/var/lib/kubelet/pods/a3d53224-cd7a-4eb5-893c-4e764335918b/volumes" Mar 09 16:38:10 crc kubenswrapper[4831]: I0309 16:38:10.403624 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" event={"ID":"f1373fb8-b81b-422e-9afb-55f21b1b0ab2","Type":"ContainerStarted","Data":"706598ceb64784f66c6e7128738053a81edf97bc4e403492d89b0539d93b4594"} Mar 09 16:38:10 crc kubenswrapper[4831]: I0309 16:38:10.403663 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" event={"ID":"f1373fb8-b81b-422e-9afb-55f21b1b0ab2","Type":"ContainerStarted","Data":"6d284b9d0a74f5c69274b0e53872e6db3581ba3fe321c14a13c6234a28531a73"} Mar 09 16:38:10 crc kubenswrapper[4831]: I0309 16:38:10.444939 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" 
podStartSLOduration=2.444916186 podStartE2EDuration="2.444916186s" podCreationTimestamp="2026-03-09 16:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:10.441468848 +0000 UTC m=+2417.575151271" watchObservedRunningTime="2026-03-09 16:38:10.444916186 +0000 UTC m=+2417.578598609" Mar 09 16:38:11 crc kubenswrapper[4831]: I0309 16:38:11.412526 4831 generic.go:334] "Generic (PLEG): container finished" podID="f1373fb8-b81b-422e-9afb-55f21b1b0ab2" containerID="706598ceb64784f66c6e7128738053a81edf97bc4e403492d89b0539d93b4594" exitCode=0 Mar 09 16:38:11 crc kubenswrapper[4831]: I0309 16:38:11.412570 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" event={"ID":"f1373fb8-b81b-422e-9afb-55f21b1b0ab2","Type":"ContainerDied","Data":"706598ceb64784f66c6e7128738053a81edf97bc4e403492d89b0539d93b4594"} Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.714223 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.742619 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4"] Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.748673 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4"] Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.804438 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974cn\" (UniqueName: \"kubernetes.io/projected/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-kube-api-access-974cn\") pod \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.804486 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-scripts\") pod \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.804567 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-etc-swift\") pod \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.804595 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-swiftconf\") pod \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.804630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-ring-data-devices\") pod \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.804681 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-dispersionconf\") pod \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\" (UID: \"f1373fb8-b81b-422e-9afb-55f21b1b0ab2\") " Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.805156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f1373fb8-b81b-422e-9afb-55f21b1b0ab2" (UID: "f1373fb8-b81b-422e-9afb-55f21b1b0ab2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.806134 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f1373fb8-b81b-422e-9afb-55f21b1b0ab2" (UID: "f1373fb8-b81b-422e-9afb-55f21b1b0ab2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.814177 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-kube-api-access-974cn" (OuterVolumeSpecName: "kube-api-access-974cn") pod "f1373fb8-b81b-422e-9afb-55f21b1b0ab2" (UID: "f1373fb8-b81b-422e-9afb-55f21b1b0ab2"). InnerVolumeSpecName "kube-api-access-974cn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.827245 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f1373fb8-b81b-422e-9afb-55f21b1b0ab2" (UID: "f1373fb8-b81b-422e-9afb-55f21b1b0ab2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.828114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f1373fb8-b81b-422e-9afb-55f21b1b0ab2" (UID: "f1373fb8-b81b-422e-9afb-55f21b1b0ab2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.829597 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-scripts" (OuterVolumeSpecName: "scripts") pod "f1373fb8-b81b-422e-9afb-55f21b1b0ab2" (UID: "f1373fb8-b81b-422e-9afb-55f21b1b0ab2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.906159 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.906195 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974cn\" (UniqueName: \"kubernetes.io/projected/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-kube-api-access-974cn\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.906209 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.906221 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.906231 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:12 crc kubenswrapper[4831]: I0309 16:38:12.906242 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1373fb8-b81b-422e-9afb-55f21b1b0ab2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.428173 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d284b9d0a74f5c69274b0e53872e6db3581ba3fe321c14a13c6234a28531a73" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.428250 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9wcq4" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.625609 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1373fb8-b81b-422e-9afb-55f21b1b0ab2" path="/var/lib/kubelet/pods/f1373fb8-b81b-422e-9afb-55f21b1b0ab2/volumes" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.871265 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5"] Mar 09 16:38:13 crc kubenswrapper[4831]: E0309 16:38:13.871639 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1373fb8-b81b-422e-9afb-55f21b1b0ab2" containerName="swift-ring-rebalance" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.871654 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1373fb8-b81b-422e-9afb-55f21b1b0ab2" containerName="swift-ring-rebalance" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.871871 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1373fb8-b81b-422e-9afb-55f21b1b0ab2" containerName="swift-ring-rebalance" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.872516 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.874742 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.875504 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:13 crc kubenswrapper[4831]: I0309 16:38:13.880555 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5"] Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.023492 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-swiftconf\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.023557 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a4090c5-89d9-4b4a-829c-7c259f167f95-etc-swift\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.023594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psggn\" (UniqueName: \"kubernetes.io/projected/8a4090c5-89d9-4b4a-829c-7c259f167f95-kube-api-access-psggn\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.023613 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-ring-data-devices\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.023705 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-scripts\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.023754 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-dispersionconf\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.124970 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-swiftconf\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.125047 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a4090c5-89d9-4b4a-829c-7c259f167f95-etc-swift\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: 
I0309 16:38:14.125105 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-ring-data-devices\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.125128 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psggn\" (UniqueName: \"kubernetes.io/projected/8a4090c5-89d9-4b4a-829c-7c259f167f95-kube-api-access-psggn\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.125162 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-scripts\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.125186 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-dispersionconf\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.127440 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a4090c5-89d9-4b4a-829c-7c259f167f95-etc-swift\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc 
kubenswrapper[4831]: I0309 16:38:14.127748 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-scripts\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.127839 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-ring-data-devices\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.129826 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-dispersionconf\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.131358 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-swiftconf\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.146691 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psggn\" (UniqueName: \"kubernetes.io/projected/8a4090c5-89d9-4b4a-829c-7c259f167f95-kube-api-access-psggn\") pod \"swift-ring-rebalance-debug-8w6p5\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: 
I0309 16:38:14.202525 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:14 crc kubenswrapper[4831]: I0309 16:38:14.658818 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5"] Mar 09 16:38:15 crc kubenswrapper[4831]: I0309 16:38:15.447832 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" event={"ID":"8a4090c5-89d9-4b4a-829c-7c259f167f95","Type":"ContainerStarted","Data":"84bf6e168e59d432418cf76097abc50a141947e970253ea3a5086be439f761cf"} Mar 09 16:38:15 crc kubenswrapper[4831]: I0309 16:38:15.448175 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" event={"ID":"8a4090c5-89d9-4b4a-829c-7c259f167f95","Type":"ContainerStarted","Data":"551b291c64fc77fd55e54958645eecbbc946754a7afc3a6fdce7feeae5e32f87"} Mar 09 16:38:15 crc kubenswrapper[4831]: I0309 16:38:15.473442 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" podStartSLOduration=2.473427729 podStartE2EDuration="2.473427729s" podCreationTimestamp="2026-03-09 16:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:15.473171991 +0000 UTC m=+2422.606854424" watchObservedRunningTime="2026-03-09 16:38:15.473427729 +0000 UTC m=+2422.607110152" Mar 09 16:38:16 crc kubenswrapper[4831]: I0309 16:38:16.470589 4831 generic.go:334] "Generic (PLEG): container finished" podID="8a4090c5-89d9-4b4a-829c-7c259f167f95" containerID="84bf6e168e59d432418cf76097abc50a141947e970253ea3a5086be439f761cf" exitCode=0 Mar 09 16:38:16 crc kubenswrapper[4831]: I0309 16:38:16.470691 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" 
event={"ID":"8a4090c5-89d9-4b4a-829c-7c259f167f95","Type":"ContainerDied","Data":"84bf6e168e59d432418cf76097abc50a141947e970253ea3a5086be439f761cf"} Mar 09 16:38:16 crc kubenswrapper[4831]: I0309 16:38:16.617579 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:38:16 crc kubenswrapper[4831]: E0309 16:38:16.617912 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.739490 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.777875 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5"] Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.786388 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5"] Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884113 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-ring-data-devices\") pod \"8a4090c5-89d9-4b4a-829c-7c259f167f95\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884221 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-scripts\") pod 
\"8a4090c5-89d9-4b4a-829c-7c259f167f95\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884266 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-swiftconf\") pod \"8a4090c5-89d9-4b4a-829c-7c259f167f95\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a4090c5-89d9-4b4a-829c-7c259f167f95-etc-swift\") pod \"8a4090c5-89d9-4b4a-829c-7c259f167f95\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psggn\" (UniqueName: \"kubernetes.io/projected/8a4090c5-89d9-4b4a-829c-7c259f167f95-kube-api-access-psggn\") pod \"8a4090c5-89d9-4b4a-829c-7c259f167f95\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884446 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-dispersionconf\") pod \"8a4090c5-89d9-4b4a-829c-7c259f167f95\" (UID: \"8a4090c5-89d9-4b4a-829c-7c259f167f95\") " Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884829 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8a4090c5-89d9-4b4a-829c-7c259f167f95" (UID: "8a4090c5-89d9-4b4a-829c-7c259f167f95"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.884955 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a4090c5-89d9-4b4a-829c-7c259f167f95-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8a4090c5-89d9-4b4a-829c-7c259f167f95" (UID: "8a4090c5-89d9-4b4a-829c-7c259f167f95"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.888432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a4090c5-89d9-4b4a-829c-7c259f167f95-kube-api-access-psggn" (OuterVolumeSpecName: "kube-api-access-psggn") pod "8a4090c5-89d9-4b4a-829c-7c259f167f95" (UID: "8a4090c5-89d9-4b4a-829c-7c259f167f95"). InnerVolumeSpecName "kube-api-access-psggn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.906659 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8a4090c5-89d9-4b4a-829c-7c259f167f95" (UID: "8a4090c5-89d9-4b4a-829c-7c259f167f95"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.908318 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8a4090c5-89d9-4b4a-829c-7c259f167f95" (UID: "8a4090c5-89d9-4b4a-829c-7c259f167f95"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.914274 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-scripts" (OuterVolumeSpecName: "scripts") pod "8a4090c5-89d9-4b4a-829c-7c259f167f95" (UID: "8a4090c5-89d9-4b4a-829c-7c259f167f95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.986613 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psggn\" (UniqueName: \"kubernetes.io/projected/8a4090c5-89d9-4b4a-829c-7c259f167f95-kube-api-access-psggn\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.986644 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.986653 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.986662 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a4090c5-89d9-4b4a-829c-7c259f167f95-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.986669 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a4090c5-89d9-4b4a-829c-7c259f167f95-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:17 crc kubenswrapper[4831]: I0309 16:38:17.986678 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/8a4090c5-89d9-4b4a-829c-7c259f167f95-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.493829 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="551b291c64fc77fd55e54958645eecbbc946754a7afc3a6fdce7feeae5e32f87" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.494096 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8w6p5" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.916617 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2"] Mar 09 16:38:18 crc kubenswrapper[4831]: E0309 16:38:18.917169 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a4090c5-89d9-4b4a-829c-7c259f167f95" containerName="swift-ring-rebalance" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.917181 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a4090c5-89d9-4b4a-829c-7c259f167f95" containerName="swift-ring-rebalance" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.917346 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a4090c5-89d9-4b4a-829c-7c259f167f95" containerName="swift-ring-rebalance" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.917872 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.921529 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.921545 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:18 crc kubenswrapper[4831]: I0309 16:38:18.935063 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2"] Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.102197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs4vk\" (UniqueName: \"kubernetes.io/projected/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-kube-api-access-bs4vk\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.102286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-swiftconf\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.102328 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-dispersionconf\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.102384 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-etc-swift\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.102490 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-scripts\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.102782 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-ring-data-devices\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.204768 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs4vk\" (UniqueName: \"kubernetes.io/projected/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-kube-api-access-bs4vk\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.205093 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-swiftconf\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc 
kubenswrapper[4831]: I0309 16:38:19.205225 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-dispersionconf\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.205358 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-etc-swift\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.205490 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-scripts\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.205607 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-ring-data-devices\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.207224 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-ring-data-devices\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 
crc kubenswrapper[4831]: I0309 16:38:19.207259 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-scripts\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.207745 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-etc-swift\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.209188 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-dispersionconf\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.217036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-swiftconf\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.223214 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs4vk\" (UniqueName: \"kubernetes.io/projected/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-kube-api-access-bs4vk\") pod \"swift-ring-rebalance-debug-zwgm2\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 
16:38:19.238320 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.527304 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2"] Mar 09 16:38:19 crc kubenswrapper[4831]: I0309 16:38:19.635643 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a4090c5-89d9-4b4a-829c-7c259f167f95" path="/var/lib/kubelet/pods/8a4090c5-89d9-4b4a-829c-7c259f167f95/volumes" Mar 09 16:38:20 crc kubenswrapper[4831]: I0309 16:38:20.535813 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" event={"ID":"d8c9ced4-2272-4ca2-960f-1b3a8421fde1","Type":"ContainerStarted","Data":"45184c129ef0615c5176a8858e9424891eccc2d8d8aad7ed075c73b1023275a3"} Mar 09 16:38:20 crc kubenswrapper[4831]: I0309 16:38:20.537471 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" event={"ID":"d8c9ced4-2272-4ca2-960f-1b3a8421fde1","Type":"ContainerStarted","Data":"409aebe689e6082932589fb332449ac91895891126ce7dec1e6779d2de1d9220"} Mar 09 16:38:20 crc kubenswrapper[4831]: I0309 16:38:20.558616 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" podStartSLOduration=2.558590228 podStartE2EDuration="2.558590228s" podCreationTimestamp="2026-03-09 16:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:20.554250595 +0000 UTC m=+2427.687933048" watchObservedRunningTime="2026-03-09 16:38:20.558590228 +0000 UTC m=+2427.692272671" Mar 09 16:38:21 crc kubenswrapper[4831]: I0309 16:38:21.546171 4831 generic.go:334] "Generic (PLEG): container finished" podID="d8c9ced4-2272-4ca2-960f-1b3a8421fde1" 
containerID="45184c129ef0615c5176a8858e9424891eccc2d8d8aad7ed075c73b1023275a3" exitCode=0 Mar 09 16:38:21 crc kubenswrapper[4831]: I0309 16:38:21.546217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" event={"ID":"d8c9ced4-2272-4ca2-960f-1b3a8421fde1","Type":"ContainerDied","Data":"45184c129ef0615c5176a8858e9424891eccc2d8d8aad7ed075c73b1023275a3"} Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.854097 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.886801 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2"] Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.892269 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2"] Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.998522 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-dispersionconf\") pod \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.998593 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-ring-data-devices\") pod \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.998624 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-swiftconf\") pod \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\" (UID: 
\"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.998687 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs4vk\" (UniqueName: \"kubernetes.io/projected/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-kube-api-access-bs4vk\") pod \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.998762 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-etc-swift\") pod \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " Mar 09 16:38:22 crc kubenswrapper[4831]: I0309 16:38:22.998783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-scripts\") pod \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\" (UID: \"d8c9ced4-2272-4ca2-960f-1b3a8421fde1\") " Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.000023 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d8c9ced4-2272-4ca2-960f-1b3a8421fde1" (UID: "d8c9ced4-2272-4ca2-960f-1b3a8421fde1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.000725 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d8c9ced4-2272-4ca2-960f-1b3a8421fde1" (UID: "d8c9ced4-2272-4ca2-960f-1b3a8421fde1"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.003628 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-kube-api-access-bs4vk" (OuterVolumeSpecName: "kube-api-access-bs4vk") pod "d8c9ced4-2272-4ca2-960f-1b3a8421fde1" (UID: "d8c9ced4-2272-4ca2-960f-1b3a8421fde1"). InnerVolumeSpecName "kube-api-access-bs4vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.028460 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-scripts" (OuterVolumeSpecName: "scripts") pod "d8c9ced4-2272-4ca2-960f-1b3a8421fde1" (UID: "d8c9ced4-2272-4ca2-960f-1b3a8421fde1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.031034 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d8c9ced4-2272-4ca2-960f-1b3a8421fde1" (UID: "d8c9ced4-2272-4ca2-960f-1b3a8421fde1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.038569 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d8c9ced4-2272-4ca2-960f-1b3a8421fde1" (UID: "d8c9ced4-2272-4ca2-960f-1b3a8421fde1"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.100017 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.100063 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.100077 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.100094 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.100107 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.100120 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs4vk\" (UniqueName: \"kubernetes.io/projected/d8c9ced4-2272-4ca2-960f-1b3a8421fde1-kube-api-access-bs4vk\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.567660 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="409aebe689e6082932589fb332449ac91895891126ce7dec1e6779d2de1d9220" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.567782 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zwgm2" Mar 09 16:38:23 crc kubenswrapper[4831]: I0309 16:38:23.629203 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c9ced4-2272-4ca2-960f-1b3a8421fde1" path="/var/lib/kubelet/pods/d8c9ced4-2272-4ca2-960f-1b3a8421fde1/volumes" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.014707 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4"] Mar 09 16:38:24 crc kubenswrapper[4831]: E0309 16:38:24.015056 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c9ced4-2272-4ca2-960f-1b3a8421fde1" containerName="swift-ring-rebalance" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.015071 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c9ced4-2272-4ca2-960f-1b3a8421fde1" containerName="swift-ring-rebalance" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.015278 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c9ced4-2272-4ca2-960f-1b3a8421fde1" containerName="swift-ring-rebalance" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.015846 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.018173 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.018328 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.023820 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4"] Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.114093 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-ring-data-devices\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.114190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68be5f94-a55b-458c-a2dd-b4cb958b5587-etc-swift\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.114231 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-swiftconf\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.114271 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bfx\" (UniqueName: \"kubernetes.io/projected/68be5f94-a55b-458c-a2dd-b4cb958b5587-kube-api-access-m4bfx\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.114330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-dispersionconf\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.114459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-scripts\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.216541 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-swiftconf\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.216681 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bfx\" (UniqueName: \"kubernetes.io/projected/68be5f94-a55b-458c-a2dd-b4cb958b5587-kube-api-access-m4bfx\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 
16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.216774 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-dispersionconf\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.216938 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-scripts\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.218905 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-scripts\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.219133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-ring-data-devices\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.219242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68be5f94-a55b-458c-a2dd-b4cb958b5587-etc-swift\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc 
kubenswrapper[4831]: I0309 16:38:24.219672 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68be5f94-a55b-458c-a2dd-b4cb958b5587-etc-swift\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.220862 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-dispersionconf\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.224998 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-swiftconf\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.231789 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-ring-data-devices\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.234850 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bfx\" (UniqueName: \"kubernetes.io/projected/68be5f94-a55b-458c-a2dd-b4cb958b5587-kube-api-access-m4bfx\") pod \"swift-ring-rebalance-debug-ll9f4\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: 
I0309 16:38:24.335073 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:24 crc kubenswrapper[4831]: I0309 16:38:24.575953 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4"] Mar 09 16:38:24 crc kubenswrapper[4831]: W0309 16:38:24.588267 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68be5f94_a55b_458c_a2dd_b4cb958b5587.slice/crio-2b8f84f7efa0de593caf8280ae7a02858f072b4503ba91f79ca78a8c001405c1 WatchSource:0}: Error finding container 2b8f84f7efa0de593caf8280ae7a02858f072b4503ba91f79ca78a8c001405c1: Status 404 returned error can't find the container with id 2b8f84f7efa0de593caf8280ae7a02858f072b4503ba91f79ca78a8c001405c1 Mar 09 16:38:25 crc kubenswrapper[4831]: I0309 16:38:25.599285 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" event={"ID":"68be5f94-a55b-458c-a2dd-b4cb958b5587","Type":"ContainerStarted","Data":"954657b9fc4336db0ea9cca1779f37d7196d227522516d476b11ad503f89d698"} Mar 09 16:38:25 crc kubenswrapper[4831]: I0309 16:38:25.599659 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" event={"ID":"68be5f94-a55b-458c-a2dd-b4cb958b5587","Type":"ContainerStarted","Data":"2b8f84f7efa0de593caf8280ae7a02858f072b4503ba91f79ca78a8c001405c1"} Mar 09 16:38:26 crc kubenswrapper[4831]: I0309 16:38:26.609720 4831 generic.go:334] "Generic (PLEG): container finished" podID="68be5f94-a55b-458c-a2dd-b4cb958b5587" containerID="954657b9fc4336db0ea9cca1779f37d7196d227522516d476b11ad503f89d698" exitCode=0 Mar 09 16:38:26 crc kubenswrapper[4831]: I0309 16:38:26.609815 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" 
event={"ID":"68be5f94-a55b-458c-a2dd-b4cb958b5587","Type":"ContainerDied","Data":"954657b9fc4336db0ea9cca1779f37d7196d227522516d476b11ad503f89d698"} Mar 09 16:38:27 crc kubenswrapper[4831]: I0309 16:38:27.912956 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:27 crc kubenswrapper[4831]: I0309 16:38:27.943626 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4"] Mar 09 16:38:27 crc kubenswrapper[4831]: I0309 16:38:27.955171 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4"] Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.073924 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-ring-data-devices\") pod \"68be5f94-a55b-458c-a2dd-b4cb958b5587\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.074255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-swiftconf\") pod \"68be5f94-a55b-458c-a2dd-b4cb958b5587\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.074324 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4bfx\" (UniqueName: \"kubernetes.io/projected/68be5f94-a55b-458c-a2dd-b4cb958b5587-kube-api-access-m4bfx\") pod \"68be5f94-a55b-458c-a2dd-b4cb958b5587\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.074364 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-scripts\") pod \"68be5f94-a55b-458c-a2dd-b4cb958b5587\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.074421 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-dispersionconf\") pod \"68be5f94-a55b-458c-a2dd-b4cb958b5587\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.074478 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68be5f94-a55b-458c-a2dd-b4cb958b5587-etc-swift\") pod \"68be5f94-a55b-458c-a2dd-b4cb958b5587\" (UID: \"68be5f94-a55b-458c-a2dd-b4cb958b5587\") " Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.074831 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "68be5f94-a55b-458c-a2dd-b4cb958b5587" (UID: "68be5f94-a55b-458c-a2dd-b4cb958b5587"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.075535 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68be5f94-a55b-458c-a2dd-b4cb958b5587-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "68be5f94-a55b-458c-a2dd-b4cb958b5587" (UID: "68be5f94-a55b-458c-a2dd-b4cb958b5587"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.080467 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68be5f94-a55b-458c-a2dd-b4cb958b5587-kube-api-access-m4bfx" (OuterVolumeSpecName: "kube-api-access-m4bfx") pod "68be5f94-a55b-458c-a2dd-b4cb958b5587" (UID: "68be5f94-a55b-458c-a2dd-b4cb958b5587"). InnerVolumeSpecName "kube-api-access-m4bfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.097107 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-scripts" (OuterVolumeSpecName: "scripts") pod "68be5f94-a55b-458c-a2dd-b4cb958b5587" (UID: "68be5f94-a55b-458c-a2dd-b4cb958b5587"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.099674 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "68be5f94-a55b-458c-a2dd-b4cb958b5587" (UID: "68be5f94-a55b-458c-a2dd-b4cb958b5587"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.100731 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "68be5f94-a55b-458c-a2dd-b4cb958b5587" (UID: "68be5f94-a55b-458c-a2dd-b4cb958b5587"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.176706 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.176746 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.176756 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4bfx\" (UniqueName: \"kubernetes.io/projected/68be5f94-a55b-458c-a2dd-b4cb958b5587-kube-api-access-m4bfx\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.176768 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68be5f94-a55b-458c-a2dd-b4cb958b5587-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.176777 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68be5f94-a55b-458c-a2dd-b4cb958b5587-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.176784 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68be5f94-a55b-458c-a2dd-b4cb958b5587-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.632595 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b8f84f7efa0de593caf8280ae7a02858f072b4503ba91f79ca78a8c001405c1" Mar 09 16:38:28 crc kubenswrapper[4831]: I0309 16:38:28.632661 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ll9f4" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.085830 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kztmp"] Mar 09 16:38:29 crc kubenswrapper[4831]: E0309 16:38:29.086631 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68be5f94-a55b-458c-a2dd-b4cb958b5587" containerName="swift-ring-rebalance" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.086662 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="68be5f94-a55b-458c-a2dd-b4cb958b5587" containerName="swift-ring-rebalance" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.087048 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="68be5f94-a55b-458c-a2dd-b4cb958b5587" containerName="swift-ring-rebalance" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.088248 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.091274 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.092151 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.106557 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kztmp"] Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.191697 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-ring-data-devices\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.191741 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww4tx\" (UniqueName: \"kubernetes.io/projected/03346a9e-8c9f-4984-ac84-0d2732f05271-kube-api-access-ww4tx\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.191787 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-scripts\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.192318 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-dispersionconf\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.192368 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-swiftconf\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.192442 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/03346a9e-8c9f-4984-ac84-0d2732f05271-etc-swift\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.294962 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-scripts\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.295055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-dispersionconf\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.295087 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-swiftconf\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.295173 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03346a9e-8c9f-4984-ac84-0d2732f05271-etc-swift\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.295240 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-ring-data-devices\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.295306 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww4tx\" (UniqueName: \"kubernetes.io/projected/03346a9e-8c9f-4984-ac84-0d2732f05271-kube-api-access-ww4tx\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.295783 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03346a9e-8c9f-4984-ac84-0d2732f05271-etc-swift\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.296128 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-scripts\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.296368 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-ring-data-devices\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.300503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-swiftconf\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.300799 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-dispersionconf\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.315883 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww4tx\" (UniqueName: \"kubernetes.io/projected/03346a9e-8c9f-4984-ac84-0d2732f05271-kube-api-access-ww4tx\") pod \"swift-ring-rebalance-debug-kztmp\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.418714 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.629215 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68be5f94-a55b-458c-a2dd-b4cb958b5587" path="/var/lib/kubelet/pods/68be5f94-a55b-458c-a2dd-b4cb958b5587/volumes" Mar 09 16:38:29 crc kubenswrapper[4831]: I0309 16:38:29.854138 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kztmp"] Mar 09 16:38:30 crc kubenswrapper[4831]: I0309 16:38:30.618016 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:38:30 crc kubenswrapper[4831]: E0309 16:38:30.619136 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:38:30 crc kubenswrapper[4831]: I0309 16:38:30.650493 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" event={"ID":"03346a9e-8c9f-4984-ac84-0d2732f05271","Type":"ContainerStarted","Data":"11732d7f96d06d352eb5893fee3f4929e7c3c841d4a3219728e4e723faa97400"} Mar 09 16:38:30 crc kubenswrapper[4831]: I0309 16:38:30.650538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" event={"ID":"03346a9e-8c9f-4984-ac84-0d2732f05271","Type":"ContainerStarted","Data":"2fa3ae41e463a80ce318d2dfbcd9fb64f3f7fa89a02136ad337518c7b2d95b6c"} Mar 09 16:38:30 crc kubenswrapper[4831]: I0309 16:38:30.674602 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" podStartSLOduration=1.674585947 podStartE2EDuration="1.674585947s" podCreationTimestamp="2026-03-09 16:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:30.673444705 +0000 UTC m=+2437.807127128" watchObservedRunningTime="2026-03-09 16:38:30.674585947 +0000 UTC m=+2437.808268371" Mar 09 16:38:31 crc kubenswrapper[4831]: I0309 16:38:31.663935 4831 generic.go:334] "Generic (PLEG): container finished" podID="03346a9e-8c9f-4984-ac84-0d2732f05271" containerID="11732d7f96d06d352eb5893fee3f4929e7c3c841d4a3219728e4e723faa97400" exitCode=0 Mar 09 16:38:31 crc kubenswrapper[4831]: I0309 16:38:31.664010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" event={"ID":"03346a9e-8c9f-4984-ac84-0d2732f05271","Type":"ContainerDied","Data":"11732d7f96d06d352eb5893fee3f4929e7c3c841d4a3219728e4e723faa97400"} Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.025371 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.057778 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kztmp"] Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.066324 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kztmp"] Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.151176 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-swiftconf\") pod \"03346a9e-8c9f-4984-ac84-0d2732f05271\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.151563 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03346a9e-8c9f-4984-ac84-0d2732f05271-etc-swift\") pod \"03346a9e-8c9f-4984-ac84-0d2732f05271\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.151625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-ring-data-devices\") pod \"03346a9e-8c9f-4984-ac84-0d2732f05271\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.151762 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-dispersionconf\") pod \"03346a9e-8c9f-4984-ac84-0d2732f05271\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.151917 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ww4tx\" (UniqueName: \"kubernetes.io/projected/03346a9e-8c9f-4984-ac84-0d2732f05271-kube-api-access-ww4tx\") pod \"03346a9e-8c9f-4984-ac84-0d2732f05271\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.151989 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-scripts\") pod \"03346a9e-8c9f-4984-ac84-0d2732f05271\" (UID: \"03346a9e-8c9f-4984-ac84-0d2732f05271\") " Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.152174 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "03346a9e-8c9f-4984-ac84-0d2732f05271" (UID: "03346a9e-8c9f-4984-ac84-0d2732f05271"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.152301 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.152310 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03346a9e-8c9f-4984-ac84-0d2732f05271-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "03346a9e-8c9f-4984-ac84-0d2732f05271" (UID: "03346a9e-8c9f-4984-ac84-0d2732f05271"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.156545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03346a9e-8c9f-4984-ac84-0d2732f05271-kube-api-access-ww4tx" (OuterVolumeSpecName: "kube-api-access-ww4tx") pod "03346a9e-8c9f-4984-ac84-0d2732f05271" (UID: "03346a9e-8c9f-4984-ac84-0d2732f05271"). InnerVolumeSpecName "kube-api-access-ww4tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.174059 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-scripts" (OuterVolumeSpecName: "scripts") pod "03346a9e-8c9f-4984-ac84-0d2732f05271" (UID: "03346a9e-8c9f-4984-ac84-0d2732f05271"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.174147 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "03346a9e-8c9f-4984-ac84-0d2732f05271" (UID: "03346a9e-8c9f-4984-ac84-0d2732f05271"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.175926 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "03346a9e-8c9f-4984-ac84-0d2732f05271" (UID: "03346a9e-8c9f-4984-ac84-0d2732f05271"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.253454 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww4tx\" (UniqueName: \"kubernetes.io/projected/03346a9e-8c9f-4984-ac84-0d2732f05271-kube-api-access-ww4tx\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.253491 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03346a9e-8c9f-4984-ac84-0d2732f05271-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.253500 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.253509 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03346a9e-8c9f-4984-ac84-0d2732f05271-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.253517 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03346a9e-8c9f-4984-ac84-0d2732f05271-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.629769 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03346a9e-8c9f-4984-ac84-0d2732f05271" path="/var/lib/kubelet/pods/03346a9e-8c9f-4984-ac84-0d2732f05271/volumes" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.686359 4831 scope.go:117] "RemoveContainer" containerID="11732d7f96d06d352eb5893fee3f4929e7c3c841d4a3219728e4e723faa97400" Mar 09 16:38:33 crc kubenswrapper[4831]: I0309 16:38:33.686375 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kztmp" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.216529 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tk46w"] Mar 09 16:38:34 crc kubenswrapper[4831]: E0309 16:38:34.216942 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03346a9e-8c9f-4984-ac84-0d2732f05271" containerName="swift-ring-rebalance" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.216963 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="03346a9e-8c9f-4984-ac84-0d2732f05271" containerName="swift-ring-rebalance" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.217377 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="03346a9e-8c9f-4984-ac84-0d2732f05271" containerName="swift-ring-rebalance" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.218155 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.221833 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.222707 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.224278 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tk46w"] Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.369622 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-scripts\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.369681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e101e2be-23bf-415e-a113-19d684784c7f-etc-swift\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.369707 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-ring-data-devices\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.369724 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-swiftconf\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.369762 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-dispersionconf\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.369848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzxn\" (UniqueName: 
\"kubernetes.io/projected/e101e2be-23bf-415e-a113-19d684784c7f-kube-api-access-mjzxn\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.470910 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzxn\" (UniqueName: \"kubernetes.io/projected/e101e2be-23bf-415e-a113-19d684784c7f-kube-api-access-mjzxn\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.470989 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-scripts\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.471051 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e101e2be-23bf-415e-a113-19d684784c7f-etc-swift\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.471077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-ring-data-devices\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.471101 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-swiftconf\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.471154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-dispersionconf\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.471483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e101e2be-23bf-415e-a113-19d684784c7f-etc-swift\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.471822 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-ring-data-devices\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.472078 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-scripts\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.479749 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-swiftconf\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.480212 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-dispersionconf\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.487008 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjzxn\" (UniqueName: \"kubernetes.io/projected/e101e2be-23bf-415e-a113-19d684784c7f-kube-api-access-mjzxn\") pod \"swift-ring-rebalance-debug-tk46w\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:34 crc kubenswrapper[4831]: I0309 16:38:34.580084 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:35 crc kubenswrapper[4831]: I0309 16:38:35.019872 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tk46w"] Mar 09 16:38:35 crc kubenswrapper[4831]: I0309 16:38:35.722605 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" event={"ID":"e101e2be-23bf-415e-a113-19d684784c7f","Type":"ContainerStarted","Data":"eb6125224acfdf25c904eb2c5d3f2dcf65c444ca9bbb9989a33ceca2b3a91bc6"} Mar 09 16:38:35 crc kubenswrapper[4831]: I0309 16:38:35.722960 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" event={"ID":"e101e2be-23bf-415e-a113-19d684784c7f","Type":"ContainerStarted","Data":"a715cbbbdf587f21368ae513e44e8c3c2799742166b7d3955a0e8c4a43d755a4"} Mar 09 16:38:36 crc kubenswrapper[4831]: I0309 16:38:36.735616 4831 generic.go:334] "Generic (PLEG): container finished" podID="e101e2be-23bf-415e-a113-19d684784c7f" containerID="eb6125224acfdf25c904eb2c5d3f2dcf65c444ca9bbb9989a33ceca2b3a91bc6" exitCode=0 Mar 09 16:38:36 crc kubenswrapper[4831]: I0309 16:38:36.735667 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" event={"ID":"e101e2be-23bf-415e-a113-19d684784c7f","Type":"ContainerDied","Data":"eb6125224acfdf25c904eb2c5d3f2dcf65c444ca9bbb9989a33ceca2b3a91bc6"} Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.029928 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.076674 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tk46w"] Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.083483 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tk46w"] Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.131246 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-dispersionconf\") pod \"e101e2be-23bf-415e-a113-19d684784c7f\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.131288 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-scripts\") pod \"e101e2be-23bf-415e-a113-19d684784c7f\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.131323 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-swiftconf\") pod \"e101e2be-23bf-415e-a113-19d684784c7f\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.131375 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e101e2be-23bf-415e-a113-19d684784c7f-etc-swift\") pod \"e101e2be-23bf-415e-a113-19d684784c7f\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.131394 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-ring-data-devices\") pod \"e101e2be-23bf-415e-a113-19d684784c7f\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.131456 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjzxn\" (UniqueName: \"kubernetes.io/projected/e101e2be-23bf-415e-a113-19d684784c7f-kube-api-access-mjzxn\") pod \"e101e2be-23bf-415e-a113-19d684784c7f\" (UID: \"e101e2be-23bf-415e-a113-19d684784c7f\") " Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.132490 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e101e2be-23bf-415e-a113-19d684784c7f" (UID: "e101e2be-23bf-415e-a113-19d684784c7f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.132924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e101e2be-23bf-415e-a113-19d684784c7f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e101e2be-23bf-415e-a113-19d684784c7f" (UID: "e101e2be-23bf-415e-a113-19d684784c7f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.137993 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e101e2be-23bf-415e-a113-19d684784c7f-kube-api-access-mjzxn" (OuterVolumeSpecName: "kube-api-access-mjzxn") pod "e101e2be-23bf-415e-a113-19d684784c7f" (UID: "e101e2be-23bf-415e-a113-19d684784c7f"). InnerVolumeSpecName "kube-api-access-mjzxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.153967 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-scripts" (OuterVolumeSpecName: "scripts") pod "e101e2be-23bf-415e-a113-19d684784c7f" (UID: "e101e2be-23bf-415e-a113-19d684784c7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.155973 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e101e2be-23bf-415e-a113-19d684784c7f" (UID: "e101e2be-23bf-415e-a113-19d684784c7f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.158488 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e101e2be-23bf-415e-a113-19d684784c7f" (UID: "e101e2be-23bf-415e-a113-19d684784c7f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.233186 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjzxn\" (UniqueName: \"kubernetes.io/projected/e101e2be-23bf-415e-a113-19d684784c7f-kube-api-access-mjzxn\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.233538 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.233549 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.233558 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e101e2be-23bf-415e-a113-19d684784c7f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.233566 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e101e2be-23bf-415e-a113-19d684784c7f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.233574 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e101e2be-23bf-415e-a113-19d684784c7f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.756225 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a715cbbbdf587f21368ae513e44e8c3c2799742166b7d3955a0e8c4a43d755a4" Mar 09 16:38:38 crc kubenswrapper[4831]: I0309 16:38:38.756275 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tk46w" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.243428 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-npz79"] Mar 09 16:38:39 crc kubenswrapper[4831]: E0309 16:38:39.243788 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e101e2be-23bf-415e-a113-19d684784c7f" containerName="swift-ring-rebalance" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.243806 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e101e2be-23bf-415e-a113-19d684784c7f" containerName="swift-ring-rebalance" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.244008 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e101e2be-23bf-415e-a113-19d684784c7f" containerName="swift-ring-rebalance" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.244603 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.246968 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.249411 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.254975 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-npz79"] Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.349926 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-ring-data-devices\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.350057 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-scripts\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.350107 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-swiftconf\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.350138 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kwt\" (UniqueName: \"kubernetes.io/projected/64c0d22c-354f-4c4a-a250-2ccf4597a31f-kube-api-access-x8kwt\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.350166 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-dispersionconf\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.350222 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/64c0d22c-354f-4c4a-a250-2ccf4597a31f-etc-swift\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.451364 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-ring-data-devices\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.451449 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-scripts\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.451467 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-swiftconf\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.451506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kwt\" (UniqueName: \"kubernetes.io/projected/64c0d22c-354f-4c4a-a250-2ccf4597a31f-kube-api-access-x8kwt\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.451546 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-dispersionconf\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.451576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/64c0d22c-354f-4c4a-a250-2ccf4597a31f-etc-swift\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.452032 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/64c0d22c-354f-4c4a-a250-2ccf4597a31f-etc-swift\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.452236 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-ring-data-devices\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.452381 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-scripts\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.457382 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-swiftconf\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.457385 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-dispersionconf\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.476178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kwt\" (UniqueName: \"kubernetes.io/projected/64c0d22c-354f-4c4a-a250-2ccf4597a31f-kube-api-access-x8kwt\") pod \"swift-ring-rebalance-debug-npz79\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.564188 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:39 crc kubenswrapper[4831]: I0309 16:38:39.632993 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e101e2be-23bf-415e-a113-19d684784c7f" path="/var/lib/kubelet/pods/e101e2be-23bf-415e-a113-19d684784c7f/volumes" Mar 09 16:38:40 crc kubenswrapper[4831]: I0309 16:38:40.027937 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-npz79"] Mar 09 16:38:40 crc kubenswrapper[4831]: I0309 16:38:40.773566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" event={"ID":"64c0d22c-354f-4c4a-a250-2ccf4597a31f","Type":"ContainerStarted","Data":"d1706c0fd47c4d86ce2face0f62a3695e0fee6ba271c947f25bbd7dbb2ba6e59"} Mar 09 16:38:40 crc kubenswrapper[4831]: I0309 16:38:40.773919 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" event={"ID":"64c0d22c-354f-4c4a-a250-2ccf4597a31f","Type":"ContainerStarted","Data":"c4db0a796eea56798a7ef724954a49552e0e019cfbc133f59e0d36639ff1773f"} Mar 09 16:38:40 crc kubenswrapper[4831]: I0309 16:38:40.793643 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" podStartSLOduration=1.793623092 podStartE2EDuration="1.793623092s" podCreationTimestamp="2026-03-09 16:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:40.788086165 +0000 UTC m=+2447.921768588" watchObservedRunningTime="2026-03-09 16:38:40.793623092 +0000 UTC m=+2447.927305505" Mar 09 16:38:41 crc kubenswrapper[4831]: I0309 16:38:41.788968 4831 generic.go:334] "Generic (PLEG): container finished" podID="64c0d22c-354f-4c4a-a250-2ccf4597a31f" containerID="d1706c0fd47c4d86ce2face0f62a3695e0fee6ba271c947f25bbd7dbb2ba6e59" exitCode=0 
Mar 09 16:38:41 crc kubenswrapper[4831]: I0309 16:38:41.789192 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" event={"ID":"64c0d22c-354f-4c4a-a250-2ccf4597a31f","Type":"ContainerDied","Data":"d1706c0fd47c4d86ce2face0f62a3695e0fee6ba271c947f25bbd7dbb2ba6e59"} Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.055790 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.092171 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-npz79"] Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.100248 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-npz79"] Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.205841 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-dispersionconf\") pod \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.206135 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kwt\" (UniqueName: \"kubernetes.io/projected/64c0d22c-354f-4c4a-a250-2ccf4597a31f-kube-api-access-x8kwt\") pod \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.206250 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/64c0d22c-354f-4c4a-a250-2ccf4597a31f-etc-swift\") pod \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 
16:38:43.206403 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-scripts\") pod \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.206500 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-ring-data-devices\") pod \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.206572 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-swiftconf\") pod \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\" (UID: \"64c0d22c-354f-4c4a-a250-2ccf4597a31f\") " Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.206963 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c0d22c-354f-4c4a-a250-2ccf4597a31f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "64c0d22c-354f-4c4a-a250-2ccf4597a31f" (UID: "64c0d22c-354f-4c4a-a250-2ccf4597a31f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.207231 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "64c0d22c-354f-4c4a-a250-2ccf4597a31f" (UID: "64c0d22c-354f-4c4a-a250-2ccf4597a31f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.211651 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c0d22c-354f-4c4a-a250-2ccf4597a31f-kube-api-access-x8kwt" (OuterVolumeSpecName: "kube-api-access-x8kwt") pod "64c0d22c-354f-4c4a-a250-2ccf4597a31f" (UID: "64c0d22c-354f-4c4a-a250-2ccf4597a31f"). InnerVolumeSpecName "kube-api-access-x8kwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.228556 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "64c0d22c-354f-4c4a-a250-2ccf4597a31f" (UID: "64c0d22c-354f-4c4a-a250-2ccf4597a31f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.230119 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-scripts" (OuterVolumeSpecName: "scripts") pod "64c0d22c-354f-4c4a-a250-2ccf4597a31f" (UID: "64c0d22c-354f-4c4a-a250-2ccf4597a31f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.230442 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "64c0d22c-354f-4c4a-a250-2ccf4597a31f" (UID: "64c0d22c-354f-4c4a-a250-2ccf4597a31f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.309770 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8kwt\" (UniqueName: \"kubernetes.io/projected/64c0d22c-354f-4c4a-a250-2ccf4597a31f-kube-api-access-x8kwt\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.309804 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/64c0d22c-354f-4c4a-a250-2ccf4597a31f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.309818 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.309826 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/64c0d22c-354f-4c4a-a250-2ccf4597a31f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.309838 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.309849 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/64c0d22c-354f-4c4a-a250-2ccf4597a31f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.626839 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:38:43 crc kubenswrapper[4831]: E0309 16:38:43.627698 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.630350 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c0d22c-354f-4c4a-a250-2ccf4597a31f" path="/var/lib/kubelet/pods/64c0d22c-354f-4c4a-a250-2ccf4597a31f/volumes" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.810747 4831 scope.go:117] "RemoveContainer" containerID="d1706c0fd47c4d86ce2face0f62a3695e0fee6ba271c947f25bbd7dbb2ba6e59" Mar 09 16:38:43 crc kubenswrapper[4831]: I0309 16:38:43.810810 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-npz79" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.233555 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh"] Mar 09 16:38:44 crc kubenswrapper[4831]: E0309 16:38:44.233941 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c0d22c-354f-4c4a-a250-2ccf4597a31f" containerName="swift-ring-rebalance" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.233958 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c0d22c-354f-4c4a-a250-2ccf4597a31f" containerName="swift-ring-rebalance" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.234105 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c0d22c-354f-4c4a-a250-2ccf4597a31f" containerName="swift-ring-rebalance" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.234704 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.237293 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.237474 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.244421 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh"] Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.323731 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.323851 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-scripts\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.323880 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-swiftconf\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.323927 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-dispersionconf\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.323973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03cede37-1909-455a-a57b-5295bdd44198-etc-swift\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.323994 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shvkg\" (UniqueName: \"kubernetes.io/projected/03cede37-1909-455a-a57b-5295bdd44198-kube-api-access-shvkg\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.426492 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.426637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-scripts\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 
crc kubenswrapper[4831]: I0309 16:38:44.426675 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-swiftconf\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.426712 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-dispersionconf\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.426764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03cede37-1909-455a-a57b-5295bdd44198-etc-swift\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.426792 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shvkg\" (UniqueName: \"kubernetes.io/projected/03cede37-1909-455a-a57b-5295bdd44198-kube-api-access-shvkg\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.428011 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 
09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.428565 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-scripts\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.429289 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03cede37-1909-455a-a57b-5295bdd44198-etc-swift\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.432429 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-dispersionconf\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.437302 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-swiftconf\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.448629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shvkg\" (UniqueName: \"kubernetes.io/projected/03cede37-1909-455a-a57b-5295bdd44198-kube-api-access-shvkg\") pod \"swift-ring-rebalance-debug-w5ndh\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: 
I0309 16:38:44.554773 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:44 crc kubenswrapper[4831]: I0309 16:38:44.976370 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh"] Mar 09 16:38:45 crc kubenswrapper[4831]: I0309 16:38:45.829533 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" event={"ID":"03cede37-1909-455a-a57b-5295bdd44198","Type":"ContainerStarted","Data":"5cc54073d291c5c55a932670d8944495a404afe39a6fffdcbc968b577dc8b497"} Mar 09 16:38:45 crc kubenswrapper[4831]: I0309 16:38:45.829850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" event={"ID":"03cede37-1909-455a-a57b-5295bdd44198","Type":"ContainerStarted","Data":"d66a825bb3b74fa341969dd8e656c72a49405d672fdcf6ad65ab04271df56d37"} Mar 09 16:38:45 crc kubenswrapper[4831]: I0309 16:38:45.846864 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" podStartSLOduration=1.846844776 podStartE2EDuration="1.846844776s" podCreationTimestamp="2026-03-09 16:38:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:45.843890932 +0000 UTC m=+2452.977573365" watchObservedRunningTime="2026-03-09 16:38:45.846844776 +0000 UTC m=+2452.980527199" Mar 09 16:38:46 crc kubenswrapper[4831]: I0309 16:38:46.838859 4831 generic.go:334] "Generic (PLEG): container finished" podID="03cede37-1909-455a-a57b-5295bdd44198" containerID="5cc54073d291c5c55a932670d8944495a404afe39a6fffdcbc968b577dc8b497" exitCode=0 Mar 09 16:38:46 crc kubenswrapper[4831]: I0309 16:38:46.838913 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" 
event={"ID":"03cede37-1909-455a-a57b-5295bdd44198","Type":"ContainerDied","Data":"5cc54073d291c5c55a932670d8944495a404afe39a6fffdcbc968b577dc8b497"} Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.127736 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.161293 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh"] Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.165153 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh"] Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.284895 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-scripts\") pod \"03cede37-1909-455a-a57b-5295bdd44198\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.285041 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-dispersionconf\") pod \"03cede37-1909-455a-a57b-5295bdd44198\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.285101 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-ring-data-devices\") pod \"03cede37-1909-455a-a57b-5295bdd44198\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.285243 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-swiftconf\") 
pod \"03cede37-1909-455a-a57b-5295bdd44198\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.285345 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shvkg\" (UniqueName: \"kubernetes.io/projected/03cede37-1909-455a-a57b-5295bdd44198-kube-api-access-shvkg\") pod \"03cede37-1909-455a-a57b-5295bdd44198\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.285453 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03cede37-1909-455a-a57b-5295bdd44198-etc-swift\") pod \"03cede37-1909-455a-a57b-5295bdd44198\" (UID: \"03cede37-1909-455a-a57b-5295bdd44198\") " Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.285926 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "03cede37-1909-455a-a57b-5295bdd44198" (UID: "03cede37-1909-455a-a57b-5295bdd44198"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.286726 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03cede37-1909-455a-a57b-5295bdd44198-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "03cede37-1909-455a-a57b-5295bdd44198" (UID: "03cede37-1909-455a-a57b-5295bdd44198"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.289949 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03cede37-1909-455a-a57b-5295bdd44198-kube-api-access-shvkg" (OuterVolumeSpecName: "kube-api-access-shvkg") pod "03cede37-1909-455a-a57b-5295bdd44198" (UID: "03cede37-1909-455a-a57b-5295bdd44198"). InnerVolumeSpecName "kube-api-access-shvkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.306070 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-scripts" (OuterVolumeSpecName: "scripts") pod "03cede37-1909-455a-a57b-5295bdd44198" (UID: "03cede37-1909-455a-a57b-5295bdd44198"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.306709 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "03cede37-1909-455a-a57b-5295bdd44198" (UID: "03cede37-1909-455a-a57b-5295bdd44198"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.306982 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "03cede37-1909-455a-a57b-5295bdd44198" (UID: "03cede37-1909-455a-a57b-5295bdd44198"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.387704 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.387739 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shvkg\" (UniqueName: \"kubernetes.io/projected/03cede37-1909-455a-a57b-5295bdd44198-kube-api-access-shvkg\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.387753 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03cede37-1909-455a-a57b-5295bdd44198-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.387762 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.387770 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03cede37-1909-455a-a57b-5295bdd44198-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.387779 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03cede37-1909-455a-a57b-5295bdd44198-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.863058 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d66a825bb3b74fa341969dd8e656c72a49405d672fdcf6ad65ab04271df56d37" Mar 09 16:38:48 crc kubenswrapper[4831]: I0309 16:38:48.863153 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ndh" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.304198 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs"] Mar 09 16:38:49 crc kubenswrapper[4831]: E0309 16:38:49.304592 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cede37-1909-455a-a57b-5295bdd44198" containerName="swift-ring-rebalance" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.304609 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cede37-1909-455a-a57b-5295bdd44198" containerName="swift-ring-rebalance" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.304792 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="03cede37-1909-455a-a57b-5295bdd44198" containerName="swift-ring-rebalance" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.305454 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.310723 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.311469 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.320213 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs"] Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.401909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-ring-data-devices\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.401959 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-etc-swift\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.402092 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-scripts\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.402218 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-swiftconf\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.402271 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-dispersionconf\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.402303 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g25jg\" (UniqueName: 
\"kubernetes.io/projected/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-kube-api-access-g25jg\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.503934 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-dispersionconf\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.504018 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g25jg\" (UniqueName: \"kubernetes.io/projected/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-kube-api-access-g25jg\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.504055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-etc-swift\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.504077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-ring-data-devices\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.504115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-scripts\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.504163 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-swiftconf\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.504635 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-etc-swift\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.505006 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-ring-data-devices\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.505156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-scripts\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.508954 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-swiftconf\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.515324 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-dispersionconf\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.537109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g25jg\" (UniqueName: \"kubernetes.io/projected/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-kube-api-access-g25jg\") pod \"swift-ring-rebalance-debug-6pzhs\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.626233 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03cede37-1909-455a-a57b-5295bdd44198" path="/var/lib/kubelet/pods/03cede37-1909-455a-a57b-5295bdd44198/volumes" Mar 09 16:38:49 crc kubenswrapper[4831]: I0309 16:38:49.632185 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:50 crc kubenswrapper[4831]: I0309 16:38:50.076594 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs"] Mar 09 16:38:50 crc kubenswrapper[4831]: I0309 16:38:50.884008 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" event={"ID":"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a","Type":"ContainerStarted","Data":"524d63931768dcc900caf2021f6419e5a7a7250906d856d8ae989e6819f00053"} Mar 09 16:38:50 crc kubenswrapper[4831]: I0309 16:38:50.884356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" event={"ID":"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a","Type":"ContainerStarted","Data":"777187def70e92e663d413d6abfeda8e6465baa7dc95263cdbadcb17998fdd75"} Mar 09 16:38:50 crc kubenswrapper[4831]: I0309 16:38:50.906937 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" podStartSLOduration=1.9069168250000001 podStartE2EDuration="1.906916825s" podCreationTimestamp="2026-03-09 16:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:50.901629405 +0000 UTC m=+2458.035311828" watchObservedRunningTime="2026-03-09 16:38:50.906916825 +0000 UTC m=+2458.040599258" Mar 09 16:38:51 crc kubenswrapper[4831]: I0309 16:38:51.893567 4831 generic.go:334] "Generic (PLEG): container finished" podID="21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" containerID="524d63931768dcc900caf2021f6419e5a7a7250906d856d8ae989e6819f00053" exitCode=0 Mar 09 16:38:51 crc kubenswrapper[4831]: I0309 16:38:51.893899 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" 
event={"ID":"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a","Type":"ContainerDied","Data":"524d63931768dcc900caf2021f6419e5a7a7250906d856d8ae989e6819f00053"} Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.167245 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.206050 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs"] Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.212070 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs"] Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.266581 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-scripts\") pod \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.266640 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-dispersionconf\") pod \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.266686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-swiftconf\") pod \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.266709 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g25jg\" (UniqueName: 
\"kubernetes.io/projected/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-kube-api-access-g25jg\") pod \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.266789 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-ring-data-devices\") pod \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.266816 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-etc-swift\") pod \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\" (UID: \"21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a\") " Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.268172 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" (UID: "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.268977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" (UID: "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.286616 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-kube-api-access-g25jg" (OuterVolumeSpecName: "kube-api-access-g25jg") pod "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" (UID: "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a"). InnerVolumeSpecName "kube-api-access-g25jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.288216 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-scripts" (OuterVolumeSpecName: "scripts") pod "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" (UID: "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.292277 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" (UID: "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.293011 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" (UID: "21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.368435 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.368479 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.368491 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.368504 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.368517 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.368528 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g25jg\" (UniqueName: \"kubernetes.io/projected/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a-kube-api-access-g25jg\") on node \"crc\" DevicePath \"\"" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.626897 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" path="/var/lib/kubelet/pods/21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a/volumes" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.910047 4831 scope.go:117] "RemoveContainer" 
containerID="524d63931768dcc900caf2021f6419e5a7a7250906d856d8ae989e6819f00053" Mar 09 16:38:53 crc kubenswrapper[4831]: I0309 16:38:53.910099 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6pzhs" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.380229 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq"] Mar 09 16:38:54 crc kubenswrapper[4831]: E0309 16:38:54.380906 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" containerName="swift-ring-rebalance" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.380921 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" containerName="swift-ring-rebalance" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.381097 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c9f0bd-e831-4e6f-94ef-4eeb74d60e2a" containerName="swift-ring-rebalance" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.381595 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.386854 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.387030 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.393528 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq"] Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.482856 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-swiftconf\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.482914 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-scripts\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.483117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-dispersionconf\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.483309 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-kg928\" (UniqueName: \"kubernetes.io/projected/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-kube-api-access-kg928\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.483380 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-etc-swift\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.483482 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.584432 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-scripts\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.584536 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-dispersionconf\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.584606 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg928\" (UniqueName: \"kubernetes.io/projected/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-kube-api-access-kg928\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.584641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-etc-swift\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.584695 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.584726 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-swiftconf\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.585584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-scripts\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.585952 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-etc-swift\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.586127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.589861 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-dispersionconf\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.609050 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-swiftconf\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.610800 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg928\" (UniqueName: \"kubernetes.io/projected/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-kube-api-access-kg928\") pod \"swift-ring-rebalance-debug-gzhjq\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:54 crc kubenswrapper[4831]: I0309 16:38:54.753172 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" Mar 09 16:38:55 crc kubenswrapper[4831]: I0309 16:38:55.172476 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq"] Mar 09 16:38:55 crc kubenswrapper[4831]: I0309 16:38:55.934166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" event={"ID":"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d","Type":"ContainerStarted","Data":"0b5b68182e9990ddf4ff5add30c1f44f51b45bd71f821182bb07150e8ff8499c"} Mar 09 16:38:55 crc kubenswrapper[4831]: I0309 16:38:55.935735 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" event={"ID":"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d","Type":"ContainerStarted","Data":"5854a6bb00855332a2c870433a5237495edc293c68ab6d906584de11ffdec451"} Mar 09 16:38:55 crc kubenswrapper[4831]: I0309 16:38:55.955921 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" podStartSLOduration=1.955904879 podStartE2EDuration="1.955904879s" podCreationTimestamp="2026-03-09 16:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:38:55.948950881 +0000 UTC m=+2463.082633334" watchObservedRunningTime="2026-03-09 16:38:55.955904879 +0000 UTC m=+2463.089587302" Mar 09 16:38:56 crc kubenswrapper[4831]: I0309 16:38:56.947853 4831 generic.go:334] "Generic (PLEG): container finished" podID="ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" containerID="0b5b68182e9990ddf4ff5add30c1f44f51b45bd71f821182bb07150e8ff8499c" exitCode=0 Mar 09 16:38:56 crc kubenswrapper[4831]: I0309 16:38:56.947913 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq" 
event={"ID":"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d","Type":"ContainerDied","Data":"0b5b68182e9990ddf4ff5add30c1f44f51b45bd71f821182bb07150e8ff8499c"}
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.251633 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq"
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.285965 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq"]
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.291481 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq"]
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.345961 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-swiftconf\") pod \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") "
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.346318 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-etc-swift\") pod \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") "
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.346339 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-ring-data-devices\") pod \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") "
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.346370 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-dispersionconf\") pod \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") "
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.346526 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg928\" (UniqueName: \"kubernetes.io/projected/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-kube-api-access-kg928\") pod \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") "
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.346576 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-scripts\") pod \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\" (UID: \"ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d\") "
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.347174 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" (UID: "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.347233 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" (UID: "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.352076 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-kube-api-access-kg928" (OuterVolumeSpecName: "kube-api-access-kg928") pod "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" (UID: "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d"). InnerVolumeSpecName "kube-api-access-kg928". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.367220 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-scripts" (OuterVolumeSpecName: "scripts") pod "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" (UID: "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.370373 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" (UID: "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.372652 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" (UID: "ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.448675 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.448708 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.448718 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.448730 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.448740 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg928\" (UniqueName: \"kubernetes.io/projected/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-kube-api-access-kg928\") on node \"crc\" DevicePath \"\""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.448750 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.617777 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5"
Mar 09 16:38:58 crc kubenswrapper[4831]: E0309 16:38:58.617995 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c"
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.965667 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5854a6bb00855332a2c870433a5237495edc293c68ab6d906584de11ffdec451"
Mar 09 16:38:58 crc kubenswrapper[4831]: I0309 16:38:58.965727 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhjq"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.413023 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"]
Mar 09 16:38:59 crc kubenswrapper[4831]: E0309 16:38:59.413548 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" containerName="swift-ring-rebalance"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.413572 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" containerName="swift-ring-rebalance"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.413798 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" containerName="swift-ring-rebalance"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.414584 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.420254 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.420918 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.430363 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"]
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.463309 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-dispersionconf\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.463354 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-scripts\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.463390 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-swiftconf\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.463430 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a46f184-dae9-47e3-82f9-829b7acb3b7f-etc-swift\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.463450 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-ring-data-devices\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.463536 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxp72\" (UniqueName: \"kubernetes.io/projected/1a46f184-dae9-47e3-82f9-829b7acb3b7f-kube-api-access-jxp72\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.565357 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a46f184-dae9-47e3-82f9-829b7acb3b7f-etc-swift\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.565730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-ring-data-devices\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.565846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxp72\" (UniqueName: \"kubernetes.io/projected/1a46f184-dae9-47e3-82f9-829b7acb3b7f-kube-api-access-jxp72\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.565885 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-dispersionconf\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.565912 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-scripts\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.565950 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-swiftconf\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.566037 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a46f184-dae9-47e3-82f9-829b7acb3b7f-etc-swift\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.567149 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-ring-data-devices\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.567560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-scripts\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.569608 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-swiftconf\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.570202 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-dispersionconf\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.585350 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxp72\" (UniqueName: \"kubernetes.io/projected/1a46f184-dae9-47e3-82f9-829b7acb3b7f-kube-api-access-jxp72\") pod \"swift-ring-rebalance-debug-zbkvd\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.627186 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d" path="/var/lib/kubelet/pods/ea5b6e68-a4f1-4ebc-bc7c-0a0c7774164d/volumes"
Mar 09 16:38:59 crc kubenswrapper[4831]: I0309 16:38:59.732122 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:39:00 crc kubenswrapper[4831]: I0309 16:39:00.166724 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"]
Mar 09 16:39:00 crc kubenswrapper[4831]: W0309 16:39:00.176619 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a46f184_dae9_47e3_82f9_829b7acb3b7f.slice/crio-2899edc7215fc4f2ca4f76131143184e0f6af3c3723bbacdf96114cd9d729e11 WatchSource:0}: Error finding container 2899edc7215fc4f2ca4f76131143184e0f6af3c3723bbacdf96114cd9d729e11: Status 404 returned error can't find the container with id 2899edc7215fc4f2ca4f76131143184e0f6af3c3723bbacdf96114cd9d729e11
Mar 09 16:39:00 crc kubenswrapper[4831]: I0309 16:39:00.989033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd" event={"ID":"1a46f184-dae9-47e3-82f9-829b7acb3b7f","Type":"ContainerStarted","Data":"8b9bbba76c47ac5bf99d14430a11cb250c94c5478ce01c543225407639ff2874"}
Mar 09 16:39:00 crc kubenswrapper[4831]: I0309 16:39:00.989384 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd" event={"ID":"1a46f184-dae9-47e3-82f9-829b7acb3b7f","Type":"ContainerStarted","Data":"2899edc7215fc4f2ca4f76131143184e0f6af3c3723bbacdf96114cd9d729e11"}
Mar 09 16:39:01 crc kubenswrapper[4831]: I0309 16:39:01.008649 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd" podStartSLOduration=2.008632118 podStartE2EDuration="2.008632118s" podCreationTimestamp="2026-03-09 16:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:01.004260054 +0000 UTC m=+2468.137942477" watchObservedRunningTime="2026-03-09 16:39:01.008632118 +0000 UTC m=+2468.142314541"
Mar 09 16:39:01 crc kubenswrapper[4831]: I0309 16:39:01.999633 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a46f184-dae9-47e3-82f9-829b7acb3b7f" containerID="8b9bbba76c47ac5bf99d14430a11cb250c94c5478ce01c543225407639ff2874" exitCode=0
Mar 09 16:39:01 crc kubenswrapper[4831]: I0309 16:39:01.999697 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd" event={"ID":"1a46f184-dae9-47e3-82f9-829b7acb3b7f","Type":"ContainerDied","Data":"8b9bbba76c47ac5bf99d14430a11cb250c94c5478ce01c543225407639ff2874"}
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.313068 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.349595 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"]
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.353648 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a46f184-dae9-47e3-82f9-829b7acb3b7f-etc-swift\") pod \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") "
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.353806 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-swiftconf\") pod \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") "
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.353864 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxp72\" (UniqueName: \"kubernetes.io/projected/1a46f184-dae9-47e3-82f9-829b7acb3b7f-kube-api-access-jxp72\") pod \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") "
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.353921 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-scripts\") pod \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") "
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.353990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-dispersionconf\") pod \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") "
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.354059 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-ring-data-devices\") pod \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\" (UID: \"1a46f184-dae9-47e3-82f9-829b7acb3b7f\") "
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.356006 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a46f184-dae9-47e3-82f9-829b7acb3b7f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1a46f184-dae9-47e3-82f9-829b7acb3b7f" (UID: "1a46f184-dae9-47e3-82f9-829b7acb3b7f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.356376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1a46f184-dae9-47e3-82f9-829b7acb3b7f" (UID: "1a46f184-dae9-47e3-82f9-829b7acb3b7f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.362222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a46f184-dae9-47e3-82f9-829b7acb3b7f-kube-api-access-jxp72" (OuterVolumeSpecName: "kube-api-access-jxp72") pod "1a46f184-dae9-47e3-82f9-829b7acb3b7f" (UID: "1a46f184-dae9-47e3-82f9-829b7acb3b7f"). InnerVolumeSpecName "kube-api-access-jxp72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.362659 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"]
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.377716 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-scripts" (OuterVolumeSpecName: "scripts") pod "1a46f184-dae9-47e3-82f9-829b7acb3b7f" (UID: "1a46f184-dae9-47e3-82f9-829b7acb3b7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.380194 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1a46f184-dae9-47e3-82f9-829b7acb3b7f" (UID: "1a46f184-dae9-47e3-82f9-829b7acb3b7f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.384929 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1a46f184-dae9-47e3-82f9-829b7acb3b7f" (UID: "1a46f184-dae9-47e3-82f9-829b7acb3b7f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.400301 4831 scope.go:117] "RemoveContainer" containerID="e2d9ec11d3ad904b054bcebfea6c1231955a2383528fefb1b6b1a50ae023e6af"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.463302 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a46f184-dae9-47e3-82f9-829b7acb3b7f-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.463420 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.463436 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxp72\" (UniqueName: \"kubernetes.io/projected/1a46f184-dae9-47e3-82f9-829b7acb3b7f-kube-api-access-jxp72\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.463452 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.463465 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a46f184-dae9-47e3-82f9-829b7acb3b7f-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.463475 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a46f184-dae9-47e3-82f9-829b7acb3b7f-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.465365 4831 scope.go:117] "RemoveContainer" containerID="f0e58a0ad94be442ea8ea9f6825dc25881be3e38ea4b5f3df96108f53d7cae92"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.492227 4831 scope.go:117] "RemoveContainer" containerID="a143a0caffcacb833733fd68c8aced486fe9ee7fd9f3f5b408341967a8b87283"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.514360 4831 scope.go:117] "RemoveContainer" containerID="2fcaece23124da21d8bd80082a67824c2138721de06ab9ccccf230f6c63a0960"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.538148 4831 scope.go:117] "RemoveContainer" containerID="d762e3b6903adff76b40bb6985da7a3a154ea2035025890ef8c80c117afb8fc3"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.570856 4831 scope.go:117] "RemoveContainer" containerID="bd86660f9d20b5a8679ee457db628d395d5d54ad4094cb56560fe4e5c3320971"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.599539 4831 scope.go:117] "RemoveContainer" containerID="2fbdc2fd7e117df0425a41149a54360e9bd8b963cf79b620954925cd693eebc0"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.626386 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a46f184-dae9-47e3-82f9-829b7acb3b7f" path="/var/lib/kubelet/pods/1a46f184-dae9-47e3-82f9-829b7acb3b7f/volumes"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.640350 4831 scope.go:117] "RemoveContainer" containerID="c91a527411a8f1d27d6316eb1f44eec430ae08ec957a8c34301ad0e02089fc69"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.669607 4831 scope.go:117] "RemoveContainer" containerID="81314b12cb02d56273ca18382d65297de17eeb58d0b021a430245c48d86ff4f6"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.698878 4831 scope.go:117] "RemoveContainer" containerID="1df6a6ba8f4388be88363a7276537f409bcf5575dd95acac3153458132da19e0"
Mar 09 16:39:03 crc kubenswrapper[4831]: I0309 16:39:03.722374 4831 scope.go:117] "RemoveContainer" containerID="986737dea23a18e273b6264307453a8fb79870e7569b0330771533f5a3383a4a"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.020883 4831 scope.go:117] "RemoveContainer" containerID="8b9bbba76c47ac5bf99d14430a11cb250c94c5478ce01c543225407639ff2874"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.020889 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbkvd"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.485905 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"]
Mar 09 16:39:04 crc kubenswrapper[4831]: E0309 16:39:04.486622 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a46f184-dae9-47e3-82f9-829b7acb3b7f" containerName="swift-ring-rebalance"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.486638 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a46f184-dae9-47e3-82f9-829b7acb3b7f" containerName="swift-ring-rebalance"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.486775 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a46f184-dae9-47e3-82f9-829b7acb3b7f" containerName="swift-ring-rebalance"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.487317 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.490202 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.490323 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.494169 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"]
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.580741 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-dispersionconf\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.580798 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-ring-data-devices\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.580867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0367c6b9-5570-475f-b881-991da460c542-etc-swift\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.580897 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n57rg\" (UniqueName: \"kubernetes.io/projected/0367c6b9-5570-475f-b881-991da460c542-kube-api-access-n57rg\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.580924 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-scripts\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.580974 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.682887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-ring-data-devices\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.683005 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0367c6b9-5570-475f-b881-991da460c542-etc-swift\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.683032 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n57rg\" (UniqueName: \"kubernetes.io/projected/0367c6b9-5570-475f-b881-991da460c542-kube-api-access-n57rg\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.683078 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-scripts\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.683158 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.683204 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-dispersionconf\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.683901 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-ring-data-devices\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.684144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-scripts\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.684740 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0367c6b9-5570-475f-b881-991da460c542-etc-swift\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.690711 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-dispersionconf\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.702192 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309 16:39:04.703625 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n57rg\" (UniqueName: \"kubernetes.io/projected/0367c6b9-5570-475f-b881-991da460c542-kube-api-access-n57rg\") pod \"swift-ring-rebalance-debug-bqr64\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"
Mar 09 16:39:04 crc kubenswrapper[4831]: I0309
16:39:04.801305 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64" Mar 09 16:39:05 crc kubenswrapper[4831]: I0309 16:39:05.070732 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"] Mar 09 16:39:05 crc kubenswrapper[4831]: W0309 16:39:05.079054 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0367c6b9_5570_475f_b881_991da460c542.slice/crio-208c79d4b37205813df6e47d335d607050ad0df9466b6b2fc87d875de1c0ebaf WatchSource:0}: Error finding container 208c79d4b37205813df6e47d335d607050ad0df9466b6b2fc87d875de1c0ebaf: Status 404 returned error can't find the container with id 208c79d4b37205813df6e47d335d607050ad0df9466b6b2fc87d875de1c0ebaf Mar 09 16:39:06 crc kubenswrapper[4831]: I0309 16:39:06.041631 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64" event={"ID":"0367c6b9-5570-475f-b881-991da460c542","Type":"ContainerStarted","Data":"e96f3dc1086af4891161b15dffb55a5ea50c533520232ba195d6b56bf2be2a69"} Mar 09 16:39:06 crc kubenswrapper[4831]: I0309 16:39:06.041932 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64" event={"ID":"0367c6b9-5570-475f-b881-991da460c542","Type":"ContainerStarted","Data":"208c79d4b37205813df6e47d335d607050ad0df9466b6b2fc87d875de1c0ebaf"} Mar 09 16:39:06 crc kubenswrapper[4831]: I0309 16:39:06.070096 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64" podStartSLOduration=2.070075575 podStartE2EDuration="2.070075575s" podCreationTimestamp="2026-03-09 16:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:06.062879331 +0000 UTC 
m=+2473.196561754" watchObservedRunningTime="2026-03-09 16:39:06.070075575 +0000 UTC m=+2473.203758008" Mar 09 16:39:07 crc kubenswrapper[4831]: I0309 16:39:07.052639 4831 generic.go:334] "Generic (PLEG): container finished" podID="0367c6b9-5570-475f-b881-991da460c542" containerID="e96f3dc1086af4891161b15dffb55a5ea50c533520232ba195d6b56bf2be2a69" exitCode=0 Mar 09 16:39:07 crc kubenswrapper[4831]: I0309 16:39:07.052698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64" event={"ID":"0367c6b9-5570-475f-b881-991da460c542","Type":"ContainerDied","Data":"e96f3dc1086af4891161b15dffb55a5ea50c533520232ba195d6b56bf2be2a69"} Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.362912 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.401431 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"] Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.409363 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bqr64"] Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.436437 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n57rg\" (UniqueName: \"kubernetes.io/projected/0367c6b9-5570-475f-b881-991da460c542-kube-api-access-n57rg\") pod \"0367c6b9-5570-475f-b881-991da460c542\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.436570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-dispersionconf\") pod \"0367c6b9-5570-475f-b881-991da460c542\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " Mar 09 16:39:08 crc kubenswrapper[4831]: 
I0309 16:39:08.436604 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-ring-data-devices\") pod \"0367c6b9-5570-475f-b881-991da460c542\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.436690 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf\") pod \"0367c6b9-5570-475f-b881-991da460c542\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.436723 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0367c6b9-5570-475f-b881-991da460c542-etc-swift\") pod \"0367c6b9-5570-475f-b881-991da460c542\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.436759 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-scripts\") pod \"0367c6b9-5570-475f-b881-991da460c542\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.438156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0367c6b9-5570-475f-b881-991da460c542" (UID: "0367c6b9-5570-475f-b881-991da460c542"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.438986 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0367c6b9-5570-475f-b881-991da460c542-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0367c6b9-5570-475f-b881-991da460c542" (UID: "0367c6b9-5570-475f-b881-991da460c542"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.448238 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0367c6b9-5570-475f-b881-991da460c542-kube-api-access-n57rg" (OuterVolumeSpecName: "kube-api-access-n57rg") pod "0367c6b9-5570-475f-b881-991da460c542" (UID: "0367c6b9-5570-475f-b881-991da460c542"). InnerVolumeSpecName "kube-api-access-n57rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:08 crc kubenswrapper[4831]: E0309 16:39:08.467822 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf podName:0367c6b9-5570-475f-b881-991da460c542 nodeName:}" failed. No retries permitted until 2026-03-09 16:39:08.967769089 +0000 UTC m=+2476.101451512 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf") pod "0367c6b9-5570-475f-b881-991da460c542" (UID: "0367c6b9-5570-475f-b881-991da460c542") : error deleting /var/lib/kubelet/pods/0367c6b9-5570-475f-b881-991da460c542/volume-subpaths: remove /var/lib/kubelet/pods/0367c6b9-5570-475f-b881-991da460c542/volume-subpaths: no such file or directory Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.468264 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0367c6b9-5570-475f-b881-991da460c542" (UID: "0367c6b9-5570-475f-b881-991da460c542"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.469182 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-scripts" (OuterVolumeSpecName: "scripts") pod "0367c6b9-5570-475f-b881-991da460c542" (UID: "0367c6b9-5570-475f-b881-991da460c542"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.538294 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.538343 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.538352 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0367c6b9-5570-475f-b881-991da460c542-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.538361 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0367c6b9-5570-475f-b881-991da460c542-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:08 crc kubenswrapper[4831]: I0309 16:39:08.538370 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n57rg\" (UniqueName: \"kubernetes.io/projected/0367c6b9-5570-475f-b881-991da460c542-kube-api-access-n57rg\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.048600 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf\") pod \"0367c6b9-5570-475f-b881-991da460c542\" (UID: \"0367c6b9-5570-475f-b881-991da460c542\") " Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.053195 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf" (OuterVolumeSpecName: "swiftconf") pod 
"0367c6b9-5570-475f-b881-991da460c542" (UID: "0367c6b9-5570-475f-b881-991da460c542"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.072418 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="208c79d4b37205813df6e47d335d607050ad0df9466b6b2fc87d875de1c0ebaf" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.072505 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bqr64" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.150350 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0367c6b9-5570-475f-b881-991da460c542-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.536641 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m"] Mar 09 16:39:09 crc kubenswrapper[4831]: E0309 16:39:09.537939 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0367c6b9-5570-475f-b881-991da460c542" containerName="swift-ring-rebalance" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.538018 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0367c6b9-5570-475f-b881-991da460c542" containerName="swift-ring-rebalance" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.538292 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0367c6b9-5570-475f-b881-991da460c542" containerName="swift-ring-rebalance" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.538969 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.541112 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.542243 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.547054 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m"] Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.627497 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0367c6b9-5570-475f-b881-991da460c542" path="/var/lib/kubelet/pods/0367c6b9-5570-475f-b881-991da460c542/volumes" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.657803 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xp75\" (UniqueName: \"kubernetes.io/projected/9630eecb-1b10-428d-918b-0675d8f78007-kube-api-access-9xp75\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.657865 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-ring-data-devices\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.658144 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-swiftconf\") pod 
\"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.658310 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9630eecb-1b10-428d-918b-0675d8f78007-etc-swift\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.658353 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-dispersionconf\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.658376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-scripts\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.759938 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9630eecb-1b10-428d-918b-0675d8f78007-etc-swift\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.760003 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-dispersionconf\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.760043 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-scripts\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.760108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xp75\" (UniqueName: \"kubernetes.io/projected/9630eecb-1b10-428d-918b-0675d8f78007-kube-api-access-9xp75\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.760133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-ring-data-devices\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.760195 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-swiftconf\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.760818 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/9630eecb-1b10-428d-918b-0675d8f78007-etc-swift\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.761313 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-ring-data-devices\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.761828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-scripts\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.764167 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-swiftconf\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.764199 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-dispersionconf\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.775880 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xp75\" (UniqueName: 
\"kubernetes.io/projected/9630eecb-1b10-428d-918b-0675d8f78007-kube-api-access-9xp75\") pod \"swift-ring-rebalance-debug-cvw5m\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:09 crc kubenswrapper[4831]: I0309 16:39:09.900174 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:10 crc kubenswrapper[4831]: I0309 16:39:10.121418 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m"] Mar 09 16:39:10 crc kubenswrapper[4831]: W0309 16:39:10.123170 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9630eecb_1b10_428d_918b_0675d8f78007.slice/crio-816387ea64bc5df754c68ce0a2ede6e66d9efd3ae0a7462fcf15a2e52a084eb4 WatchSource:0}: Error finding container 816387ea64bc5df754c68ce0a2ede6e66d9efd3ae0a7462fcf15a2e52a084eb4: Status 404 returned error can't find the container with id 816387ea64bc5df754c68ce0a2ede6e66d9efd3ae0a7462fcf15a2e52a084eb4 Mar 09 16:39:11 crc kubenswrapper[4831]: I0309 16:39:11.102066 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" event={"ID":"9630eecb-1b10-428d-918b-0675d8f78007","Type":"ContainerStarted","Data":"04e661dd98701a72869edabf0a5e26b8c4f4e570118c46d98090f6655c733861"} Mar 09 16:39:11 crc kubenswrapper[4831]: I0309 16:39:11.102413 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" event={"ID":"9630eecb-1b10-428d-918b-0675d8f78007","Type":"ContainerStarted","Data":"816387ea64bc5df754c68ce0a2ede6e66d9efd3ae0a7462fcf15a2e52a084eb4"} Mar 09 16:39:11 crc kubenswrapper[4831]: I0309 16:39:11.129813 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" 
podStartSLOduration=2.129792263 podStartE2EDuration="2.129792263s" podCreationTimestamp="2026-03-09 16:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:11.122473985 +0000 UTC m=+2478.256156408" watchObservedRunningTime="2026-03-09 16:39:11.129792263 +0000 UTC m=+2478.263474676" Mar 09 16:39:11 crc kubenswrapper[4831]: I0309 16:39:11.617277 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:39:11 crc kubenswrapper[4831]: E0309 16:39:11.617552 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:39:12 crc kubenswrapper[4831]: I0309 16:39:12.122594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" event={"ID":"9630eecb-1b10-428d-918b-0675d8f78007","Type":"ContainerDied","Data":"04e661dd98701a72869edabf0a5e26b8c4f4e570118c46d98090f6655c733861"} Mar 09 16:39:12 crc kubenswrapper[4831]: I0309 16:39:12.122565 4831 generic.go:334] "Generic (PLEG): container finished" podID="9630eecb-1b10-428d-918b-0675d8f78007" containerID="04e661dd98701a72869edabf0a5e26b8c4f4e570118c46d98090f6655c733861" exitCode=0 Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.416386 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.455888 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m"] Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.461058 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m"] Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.514126 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-scripts\") pod \"9630eecb-1b10-428d-918b-0675d8f78007\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.514202 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-ring-data-devices\") pod \"9630eecb-1b10-428d-918b-0675d8f78007\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.514248 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-dispersionconf\") pod \"9630eecb-1b10-428d-918b-0675d8f78007\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.514303 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xp75\" (UniqueName: \"kubernetes.io/projected/9630eecb-1b10-428d-918b-0675d8f78007-kube-api-access-9xp75\") pod \"9630eecb-1b10-428d-918b-0675d8f78007\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.514383 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9630eecb-1b10-428d-918b-0675d8f78007-etc-swift\") pod \"9630eecb-1b10-428d-918b-0675d8f78007\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.514467 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-swiftconf\") pod \"9630eecb-1b10-428d-918b-0675d8f78007\" (UID: \"9630eecb-1b10-428d-918b-0675d8f78007\") " Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.521114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9630eecb-1b10-428d-918b-0675d8f78007" (UID: "9630eecb-1b10-428d-918b-0675d8f78007"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.521381 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9630eecb-1b10-428d-918b-0675d8f78007-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9630eecb-1b10-428d-918b-0675d8f78007" (UID: "9630eecb-1b10-428d-918b-0675d8f78007"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.532822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9630eecb-1b10-428d-918b-0675d8f78007-kube-api-access-9xp75" (OuterVolumeSpecName: "kube-api-access-9xp75") pod "9630eecb-1b10-428d-918b-0675d8f78007" (UID: "9630eecb-1b10-428d-918b-0675d8f78007"). InnerVolumeSpecName "kube-api-access-9xp75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.535937 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-scripts" (OuterVolumeSpecName: "scripts") pod "9630eecb-1b10-428d-918b-0675d8f78007" (UID: "9630eecb-1b10-428d-918b-0675d8f78007"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.539800 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9630eecb-1b10-428d-918b-0675d8f78007" (UID: "9630eecb-1b10-428d-918b-0675d8f78007"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.541621 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9630eecb-1b10-428d-918b-0675d8f78007" (UID: "9630eecb-1b10-428d-918b-0675d8f78007"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.616271 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9630eecb-1b10-428d-918b-0675d8f78007-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.616309 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.616319 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.616329 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9630eecb-1b10-428d-918b-0675d8f78007-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.616343 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9630eecb-1b10-428d-918b-0675d8f78007-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.616356 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xp75\" (UniqueName: \"kubernetes.io/projected/9630eecb-1b10-428d-918b-0675d8f78007-kube-api-access-9xp75\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:13 crc kubenswrapper[4831]: I0309 16:39:13.630207 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9630eecb-1b10-428d-918b-0675d8f78007" path="/var/lib/kubelet/pods/9630eecb-1b10-428d-918b-0675d8f78007/volumes" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.144383 4831 scope.go:117] "RemoveContainer" 
containerID="04e661dd98701a72869edabf0a5e26b8c4f4e570118c46d98090f6655c733861" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.144442 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cvw5m" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.586594 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp"] Mar 09 16:39:14 crc kubenswrapper[4831]: E0309 16:39:14.586961 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9630eecb-1b10-428d-918b-0675d8f78007" containerName="swift-ring-rebalance" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.586978 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9630eecb-1b10-428d-918b-0675d8f78007" containerName="swift-ring-rebalance" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.587106 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9630eecb-1b10-428d-918b-0675d8f78007" containerName="swift-ring-rebalance" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.587636 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.590066 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.590142 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.605065 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp"] Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.631143 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-ring-data-devices\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.631185 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdvzd\" (UniqueName: \"kubernetes.io/projected/2a19ee3f-1f87-460f-b92d-7792c6530f2f-kube-api-access-pdvzd\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.631238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a19ee3f-1f87-460f-b92d-7792c6530f2f-etc-swift\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.631294 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-dispersionconf\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.631336 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-swiftconf\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.631385 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-scripts\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.732621 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-scripts\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.732715 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-ring-data-devices\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 
16:39:14.732764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdvzd\" (UniqueName: \"kubernetes.io/projected/2a19ee3f-1f87-460f-b92d-7792c6530f2f-kube-api-access-pdvzd\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.732817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a19ee3f-1f87-460f-b92d-7792c6530f2f-etc-swift\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.732838 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-dispersionconf\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.732879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-swiftconf\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.733506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a19ee3f-1f87-460f-b92d-7792c6530f2f-etc-swift\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 
16:39:14.733637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-scripts\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.733816 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-ring-data-devices\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.743186 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-swiftconf\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.743229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-dispersionconf\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.756014 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdvzd\" (UniqueName: \"kubernetes.io/projected/2a19ee3f-1f87-460f-b92d-7792c6530f2f-kube-api-access-pdvzd\") pod \"swift-ring-rebalance-debug-qnfqp\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:14 crc kubenswrapper[4831]: I0309 16:39:14.906041 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:15 crc kubenswrapper[4831]: I0309 16:39:15.116664 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp"] Mar 09 16:39:15 crc kubenswrapper[4831]: W0309 16:39:15.135112 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a19ee3f_1f87_460f_b92d_7792c6530f2f.slice/crio-b06327fcbfa39038a65f2e07c77d4fffa77496d72ffea883aa7d65ba3c981b08 WatchSource:0}: Error finding container b06327fcbfa39038a65f2e07c77d4fffa77496d72ffea883aa7d65ba3c981b08: Status 404 returned error can't find the container with id b06327fcbfa39038a65f2e07c77d4fffa77496d72ffea883aa7d65ba3c981b08 Mar 09 16:39:15 crc kubenswrapper[4831]: I0309 16:39:15.154765 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" event={"ID":"2a19ee3f-1f87-460f-b92d-7792c6530f2f","Type":"ContainerStarted","Data":"b06327fcbfa39038a65f2e07c77d4fffa77496d72ffea883aa7d65ba3c981b08"} Mar 09 16:39:16 crc kubenswrapper[4831]: I0309 16:39:16.164341 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" event={"ID":"2a19ee3f-1f87-460f-b92d-7792c6530f2f","Type":"ContainerStarted","Data":"55171b979ae27e87bf5e360569881bb48fccf8a6c6d00c1c3d0703518916f752"} Mar 09 16:39:16 crc kubenswrapper[4831]: I0309 16:39:16.188532 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" podStartSLOduration=2.188511433 podStartE2EDuration="2.188511433s" podCreationTimestamp="2026-03-09 16:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:16.178929361 +0000 UTC m=+2483.312611794" 
watchObservedRunningTime="2026-03-09 16:39:16.188511433 +0000 UTC m=+2483.322193856" Mar 09 16:39:17 crc kubenswrapper[4831]: I0309 16:39:17.175196 4831 generic.go:334] "Generic (PLEG): container finished" podID="2a19ee3f-1f87-460f-b92d-7792c6530f2f" containerID="55171b979ae27e87bf5e360569881bb48fccf8a6c6d00c1c3d0703518916f752" exitCode=0 Mar 09 16:39:17 crc kubenswrapper[4831]: I0309 16:39:17.175262 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" event={"ID":"2a19ee3f-1f87-460f-b92d-7792c6530f2f","Type":"ContainerDied","Data":"55171b979ae27e87bf5e360569881bb48fccf8a6c6d00c1c3d0703518916f752"} Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.516603 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.548336 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp"] Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.554705 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp"] Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.587020 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-scripts\") pod \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.587103 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdvzd\" (UniqueName: \"kubernetes.io/projected/2a19ee3f-1f87-460f-b92d-7792c6530f2f-kube-api-access-pdvzd\") pod \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.587171 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-ring-data-devices\") pod \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.587193 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-dispersionconf\") pod \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.587240 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a19ee3f-1f87-460f-b92d-7792c6530f2f-etc-swift\") pod \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.587272 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-swiftconf\") pod \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\" (UID: \"2a19ee3f-1f87-460f-b92d-7792c6530f2f\") " Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.588149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2a19ee3f-1f87-460f-b92d-7792c6530f2f" (UID: "2a19ee3f-1f87-460f-b92d-7792c6530f2f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.588484 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a19ee3f-1f87-460f-b92d-7792c6530f2f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2a19ee3f-1f87-460f-b92d-7792c6530f2f" (UID: "2a19ee3f-1f87-460f-b92d-7792c6530f2f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.599622 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a19ee3f-1f87-460f-b92d-7792c6530f2f-kube-api-access-pdvzd" (OuterVolumeSpecName: "kube-api-access-pdvzd") pod "2a19ee3f-1f87-460f-b92d-7792c6530f2f" (UID: "2a19ee3f-1f87-460f-b92d-7792c6530f2f"). InnerVolumeSpecName "kube-api-access-pdvzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.609626 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2a19ee3f-1f87-460f-b92d-7792c6530f2f" (UID: "2a19ee3f-1f87-460f-b92d-7792c6530f2f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.610004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2a19ee3f-1f87-460f-b92d-7792c6530f2f" (UID: "2a19ee3f-1f87-460f-b92d-7792c6530f2f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.611616 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-scripts" (OuterVolumeSpecName: "scripts") pod "2a19ee3f-1f87-460f-b92d-7792c6530f2f" (UID: "2a19ee3f-1f87-460f-b92d-7792c6530f2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.688536 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.688596 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.688608 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a19ee3f-1f87-460f-b92d-7792c6530f2f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.688634 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a19ee3f-1f87-460f-b92d-7792c6530f2f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.688644 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a19ee3f-1f87-460f-b92d-7792c6530f2f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:18 crc kubenswrapper[4831]: I0309 16:39:18.688655 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdvzd\" (UniqueName: 
\"kubernetes.io/projected/2a19ee3f-1f87-460f-b92d-7792c6530f2f-kube-api-access-pdvzd\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.194824 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06327fcbfa39038a65f2e07c77d4fffa77496d72ffea883aa7d65ba3c981b08" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.194880 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qnfqp" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.625238 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a19ee3f-1f87-460f-b92d-7792c6530f2f" path="/var/lib/kubelet/pods/2a19ee3f-1f87-460f-b92d-7792c6530f2f/volumes" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.694478 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"] Mar 09 16:39:19 crc kubenswrapper[4831]: E0309 16:39:19.694818 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a19ee3f-1f87-460f-b92d-7792c6530f2f" containerName="swift-ring-rebalance" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.694841 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a19ee3f-1f87-460f-b92d-7792c6530f2f" containerName="swift-ring-rebalance" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.695050 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a19ee3f-1f87-460f-b92d-7792c6530f2f" containerName="swift-ring-rebalance" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.695657 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.697465 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.697583 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.701287 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-ring-data-devices\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.701332 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fk4\" (UniqueName: \"kubernetes.io/projected/73aa850e-49ff-4038-863b-107a89465ca7-kube-api-access-d6fk4\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.701361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-scripts\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.701439 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-dispersionconf\") pod 
\"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.701485 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-swiftconf\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.701535 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73aa850e-49ff-4038-863b-107a89465ca7-etc-swift\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.702476 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"] Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.803116 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-dispersionconf\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.803177 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-swiftconf\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 
16:39:19.803219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73aa850e-49ff-4038-863b-107a89465ca7-etc-swift\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.803296 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-ring-data-devices\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.803318 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6fk4\" (UniqueName: \"kubernetes.io/projected/73aa850e-49ff-4038-863b-107a89465ca7-kube-api-access-d6fk4\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.803341 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-scripts\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.803815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73aa850e-49ff-4038-863b-107a89465ca7-etc-swift\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: 
I0309 16:39:19.804166 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-scripts\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.806032 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-ring-data-devices\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.809301 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-dispersionconf\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.809314 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-swiftconf\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:19 crc kubenswrapper[4831]: I0309 16:39:19.819716 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fk4\" (UniqueName: \"kubernetes.io/projected/73aa850e-49ff-4038-863b-107a89465ca7-kube-api-access-d6fk4\") pod \"swift-ring-rebalance-debug-b2dwj\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" Mar 09 16:39:20 crc kubenswrapper[4831]: I0309 16:39:20.018782 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"
Mar 09 16:39:20 crc kubenswrapper[4831]: I0309 16:39:20.481668 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"]
Mar 09 16:39:21 crc kubenswrapper[4831]: I0309 16:39:21.214209 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" event={"ID":"73aa850e-49ff-4038-863b-107a89465ca7","Type":"ContainerStarted","Data":"f43567a24a3de777b0a2630af2ea3200674d6a898ab15dbbc545f47ff5907b5c"}
Mar 09 16:39:21 crc kubenswrapper[4831]: I0309 16:39:21.214483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" event={"ID":"73aa850e-49ff-4038-863b-107a89465ca7","Type":"ContainerStarted","Data":"5a8cabc30c37a5858c76604673554fb3569af7b36035162435913f8e74c32275"}
Mar 09 16:39:21 crc kubenswrapper[4831]: I0309 16:39:21.232078 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" podStartSLOduration=2.232058632 podStartE2EDuration="2.232058632s" podCreationTimestamp="2026-03-09 16:39:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:21.22879727 +0000 UTC m=+2488.362479703" watchObservedRunningTime="2026-03-09 16:39:21.232058632 +0000 UTC m=+2488.365741075"
Mar 09 16:39:22 crc kubenswrapper[4831]: I0309 16:39:22.223251 4831 generic.go:334] "Generic (PLEG): container finished" podID="73aa850e-49ff-4038-863b-107a89465ca7" containerID="f43567a24a3de777b0a2630af2ea3200674d6a898ab15dbbc545f47ff5907b5c" exitCode=0
Mar 09 16:39:22 crc kubenswrapper[4831]: I0309 16:39:22.223569 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj" event={"ID":"73aa850e-49ff-4038-863b-107a89465ca7","Type":"ContainerDied","Data":"f43567a24a3de777b0a2630af2ea3200674d6a898ab15dbbc545f47ff5907b5c"}
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.545849 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.582354 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"]
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.590396 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"]
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.658729 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-swiftconf\") pod \"73aa850e-49ff-4038-863b-107a89465ca7\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") "
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.658842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73aa850e-49ff-4038-863b-107a89465ca7-etc-swift\") pod \"73aa850e-49ff-4038-863b-107a89465ca7\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") "
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.658958 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-dispersionconf\") pod \"73aa850e-49ff-4038-863b-107a89465ca7\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") "
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.659044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6fk4\" (UniqueName: \"kubernetes.io/projected/73aa850e-49ff-4038-863b-107a89465ca7-kube-api-access-d6fk4\") pod \"73aa850e-49ff-4038-863b-107a89465ca7\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") "
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.659113 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-scripts\") pod \"73aa850e-49ff-4038-863b-107a89465ca7\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") "
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.659145 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-ring-data-devices\") pod \"73aa850e-49ff-4038-863b-107a89465ca7\" (UID: \"73aa850e-49ff-4038-863b-107a89465ca7\") "
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.659951 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "73aa850e-49ff-4038-863b-107a89465ca7" (UID: "73aa850e-49ff-4038-863b-107a89465ca7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.660462 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73aa850e-49ff-4038-863b-107a89465ca7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "73aa850e-49ff-4038-863b-107a89465ca7" (UID: "73aa850e-49ff-4038-863b-107a89465ca7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.667227 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73aa850e-49ff-4038-863b-107a89465ca7-kube-api-access-d6fk4" (OuterVolumeSpecName: "kube-api-access-d6fk4") pod "73aa850e-49ff-4038-863b-107a89465ca7" (UID: "73aa850e-49ff-4038-863b-107a89465ca7"). InnerVolumeSpecName "kube-api-access-d6fk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.684117 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-scripts" (OuterVolumeSpecName: "scripts") pod "73aa850e-49ff-4038-863b-107a89465ca7" (UID: "73aa850e-49ff-4038-863b-107a89465ca7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.685295 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "73aa850e-49ff-4038-863b-107a89465ca7" (UID: "73aa850e-49ff-4038-863b-107a89465ca7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.685487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "73aa850e-49ff-4038-863b-107a89465ca7" (UID: "73aa850e-49ff-4038-863b-107a89465ca7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.761374 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.761451 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.761463 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73aa850e-49ff-4038-863b-107a89465ca7-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.761474 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73aa850e-49ff-4038-863b-107a89465ca7-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.761487 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6fk4\" (UniqueName: \"kubernetes.io/projected/73aa850e-49ff-4038-863b-107a89465ca7-kube-api-access-d6fk4\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:23 crc kubenswrapper[4831]: I0309 16:39:23.761499 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73aa850e-49ff-4038-863b-107a89465ca7-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.244318 4831 scope.go:117] "RemoveContainer" containerID="f43567a24a3de777b0a2630af2ea3200674d6a898ab15dbbc545f47ff5907b5c"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.244353 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b2dwj"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.617749 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5"
Mar 09 16:39:24 crc kubenswrapper[4831]: E0309 16:39:24.617982 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.709263 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2j968"]
Mar 09 16:39:24 crc kubenswrapper[4831]: E0309 16:39:24.709895 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73aa850e-49ff-4038-863b-107a89465ca7" containerName="swift-ring-rebalance"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.709921 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="73aa850e-49ff-4038-863b-107a89465ca7" containerName="swift-ring-rebalance"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.710081 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="73aa850e-49ff-4038-863b-107a89465ca7" containerName="swift-ring-rebalance"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.710647 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.712364 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.712663 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.720115 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2j968"]
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.877622 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-swiftconf\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.878039 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4fln\" (UniqueName: \"kubernetes.io/projected/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-kube-api-access-b4fln\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.878217 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-dispersionconf\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.878321 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-scripts\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.878423 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-etc-swift\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.878652 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-ring-data-devices\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.980586 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-swiftconf\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.980649 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4fln\" (UniqueName: \"kubernetes.io/projected/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-kube-api-access-b4fln\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.980690 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-dispersionconf\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.980742 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-scripts\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.980762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-etc-swift\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.980799 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-ring-data-devices\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.981771 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-ring-data-devices\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.982188 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-etc-swift\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.982588 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-scripts\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.985979 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-dispersionconf\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:24 crc kubenswrapper[4831]: I0309 16:39:24.990736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-swiftconf\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:25 crc kubenswrapper[4831]: I0309 16:39:25.004863 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4fln\" (UniqueName: \"kubernetes.io/projected/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-kube-api-access-b4fln\") pod \"swift-ring-rebalance-debug-2j968\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:25 crc kubenswrapper[4831]: I0309 16:39:25.028236 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:25 crc kubenswrapper[4831]: I0309 16:39:25.433015 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2j968"]
Mar 09 16:39:25 crc kubenswrapper[4831]: I0309 16:39:25.626585 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73aa850e-49ff-4038-863b-107a89465ca7" path="/var/lib/kubelet/pods/73aa850e-49ff-4038-863b-107a89465ca7/volumes"
Mar 09 16:39:26 crc kubenswrapper[4831]: I0309 16:39:26.294685 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968" event={"ID":"4b1556e4-20aa-4708-81ee-17c22b7aa5f2","Type":"ContainerStarted","Data":"b5c20f460aaa00f0935c3fdcf1cda6848dea6e986b1db5cdb2b148b74841fbf2"}
Mar 09 16:39:26 crc kubenswrapper[4831]: I0309 16:39:26.295039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968" event={"ID":"4b1556e4-20aa-4708-81ee-17c22b7aa5f2","Type":"ContainerStarted","Data":"ce11dd14ab32e057f120dd48d0612afa3e2aaf5b80a32f828592f9e0ab556067"}
Mar 09 16:39:26 crc kubenswrapper[4831]: I0309 16:39:26.322079 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968" podStartSLOduration=2.32205898 podStartE2EDuration="2.32205898s" podCreationTimestamp="2026-03-09 16:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:26.308710551 +0000 UTC m=+2493.442392974" watchObservedRunningTime="2026-03-09 16:39:26.32205898 +0000 UTC m=+2493.455741403"
Mar 09 16:39:27 crc kubenswrapper[4831]: I0309 16:39:27.306383 4831 generic.go:334] "Generic (PLEG): container finished" podID="4b1556e4-20aa-4708-81ee-17c22b7aa5f2" containerID="b5c20f460aaa00f0935c3fdcf1cda6848dea6e986b1db5cdb2b148b74841fbf2" exitCode=0
Mar 09 16:39:27 crc kubenswrapper[4831]: I0309 16:39:27.306469 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968" event={"ID":"4b1556e4-20aa-4708-81ee-17c22b7aa5f2","Type":"ContainerDied","Data":"b5c20f460aaa00f0935c3fdcf1cda6848dea6e986b1db5cdb2b148b74841fbf2"}
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.604720 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.646547 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2j968"]
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.656072 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2j968"]
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.740818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-dispersionconf\") pod \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") "
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.740866 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-scripts\") pod \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") "
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.740906 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-etc-swift\") pod \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") "
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.740927 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-swiftconf\") pod \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") "
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.740976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-ring-data-devices\") pod \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") "
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.741010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4fln\" (UniqueName: \"kubernetes.io/projected/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-kube-api-access-b4fln\") pod \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\" (UID: \"4b1556e4-20aa-4708-81ee-17c22b7aa5f2\") "
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.741753 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4b1556e4-20aa-4708-81ee-17c22b7aa5f2" (UID: "4b1556e4-20aa-4708-81ee-17c22b7aa5f2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.742064 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4b1556e4-20aa-4708-81ee-17c22b7aa5f2" (UID: "4b1556e4-20aa-4708-81ee-17c22b7aa5f2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.749575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-kube-api-access-b4fln" (OuterVolumeSpecName: "kube-api-access-b4fln") pod "4b1556e4-20aa-4708-81ee-17c22b7aa5f2" (UID: "4b1556e4-20aa-4708-81ee-17c22b7aa5f2"). InnerVolumeSpecName "kube-api-access-b4fln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.762978 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-scripts" (OuterVolumeSpecName: "scripts") pod "4b1556e4-20aa-4708-81ee-17c22b7aa5f2" (UID: "4b1556e4-20aa-4708-81ee-17c22b7aa5f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.764903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4b1556e4-20aa-4708-81ee-17c22b7aa5f2" (UID: "4b1556e4-20aa-4708-81ee-17c22b7aa5f2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.765061 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4b1556e4-20aa-4708-81ee-17c22b7aa5f2" (UID: "4b1556e4-20aa-4708-81ee-17c22b7aa5f2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.842954 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.843001 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.843019 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.843036 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.843054 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:28 crc kubenswrapper[4831]: I0309 16:39:28.843075 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4fln\" (UniqueName: \"kubernetes.io/projected/4b1556e4-20aa-4708-81ee-17c22b7aa5f2-kube-api-access-b4fln\") on node \"crc\" DevicePath \"\""
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.325530 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce11dd14ab32e057f120dd48d0612afa3e2aaf5b80a32f828592f9e0ab556067"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.325669 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2j968"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.626388 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b1556e4-20aa-4708-81ee-17c22b7aa5f2" path="/var/lib/kubelet/pods/4b1556e4-20aa-4708-81ee-17c22b7aa5f2/volumes"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.790262 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"]
Mar 09 16:39:29 crc kubenswrapper[4831]: E0309 16:39:29.790545 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1556e4-20aa-4708-81ee-17c22b7aa5f2" containerName="swift-ring-rebalance"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.790563 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1556e4-20aa-4708-81ee-17c22b7aa5f2" containerName="swift-ring-rebalance"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.790702 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b1556e4-20aa-4708-81ee-17c22b7aa5f2" containerName="swift-ring-rebalance"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.791205 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.793439 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.795802 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.802457 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"]
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.857687 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-ring-data-devices\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.857739 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/675f2d45-4a21-44c7-a4fa-471e4f3763c4-etc-swift\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.857902 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-dispersionconf\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.857928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-scripts\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.857967 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxfm\" (UniqueName: \"kubernetes.io/projected/675f2d45-4a21-44c7-a4fa-471e4f3763c4-kube-api-access-bbxfm\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.858019 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-swiftconf\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.959926 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-dispersionconf\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.960217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-scripts\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.960345 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxfm\" (UniqueName: \"kubernetes.io/projected/675f2d45-4a21-44c7-a4fa-471e4f3763c4-kube-api-access-bbxfm\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.960534 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-swiftconf\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.960673 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-ring-data-devices\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.960808 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/675f2d45-4a21-44c7-a4fa-471e4f3763c4-etc-swift\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.960993 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-scripts\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.961283 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/675f2d45-4a21-44c7-a4fa-471e4f3763c4-etc-swift\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.961505 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-ring-data-devices\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.965074 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-dispersionconf\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.965124 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-swiftconf\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:29 crc kubenswrapper[4831]: I0309 16:39:29.983063 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxfm\" (UniqueName: \"kubernetes.io/projected/675f2d45-4a21-44c7-a4fa-471e4f3763c4-kube-api-access-bbxfm\") pod \"swift-ring-rebalance-debug-n24rx\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:30 crc kubenswrapper[4831]: I0309 16:39:30.115523 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:30 crc kubenswrapper[4831]: I0309 16:39:30.596511 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"]
Mar 09 16:39:31 crc kubenswrapper[4831]: I0309 16:39:31.344159 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx" event={"ID":"675f2d45-4a21-44c7-a4fa-471e4f3763c4","Type":"ContainerStarted","Data":"8f0fba3b235b0546ef92b382d368ad82cd85e01858224f5d11c596606d6f1075"}
Mar 09 16:39:31 crc kubenswrapper[4831]: I0309 16:39:31.345840 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx" event={"ID":"675f2d45-4a21-44c7-a4fa-471e4f3763c4","Type":"ContainerStarted","Data":"6ff903c97107dbb7dd5b8e9938ed6bfcd23a52a31f29530b4e312c24750274b8"}
Mar 09 16:39:31 crc kubenswrapper[4831]: I0309 16:39:31.362166 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx" podStartSLOduration=2.362138532 podStartE2EDuration="2.362138532s" podCreationTimestamp="2026-03-09 16:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:31.360422283 +0000 UTC m=+2498.494104706" watchObservedRunningTime="2026-03-09 16:39:31.362138532 +0000 UTC m=+2498.495820955"
Mar 09 16:39:32 crc kubenswrapper[4831]: I0309 16:39:32.362317 4831 generic.go:334] "Generic (PLEG): container finished" podID="675f2d45-4a21-44c7-a4fa-471e4f3763c4" containerID="8f0fba3b235b0546ef92b382d368ad82cd85e01858224f5d11c596606d6f1075" exitCode=0
Mar 09 16:39:32 crc kubenswrapper[4831]: I0309 16:39:32.362367 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx" event={"ID":"675f2d45-4a21-44c7-a4fa-471e4f3763c4","Type":"ContainerDied","Data":"8f0fba3b235b0546ef92b382d368ad82cd85e01858224f5d11c596606d6f1075"}
Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.632789 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"
Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.664478 4831 status_manager.go:875] "Failed to update status for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"675f2d45-4a21-44c7-a4fa-471e4f3763c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T16:39:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"}]}}\" for pod \"swift-kuttl-tests\"/\"swift-ring-rebalance-debug-n24rx\": pods \"swift-ring-rebalance-debug-n24rx\" not found"
Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.676123 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"]
Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.686564 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n24rx"]
Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.713813 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/675f2d45-4a21-44c7-a4fa-471e4f3763c4-etc-swift\") pod \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") "
Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.713884 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-scripts\") pod \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.713913 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-ring-data-devices\") pod \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.713957 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-swiftconf\") pod \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.714049 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-dispersionconf\") pod \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.714092 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbxfm\" (UniqueName: \"kubernetes.io/projected/675f2d45-4a21-44c7-a4fa-471e4f3763c4-kube-api-access-bbxfm\") pod \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\" (UID: \"675f2d45-4a21-44c7-a4fa-471e4f3763c4\") " Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.716041 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "675f2d45-4a21-44c7-a4fa-471e4f3763c4" (UID: "675f2d45-4a21-44c7-a4fa-471e4f3763c4"). 
InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.716636 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675f2d45-4a21-44c7-a4fa-471e4f3763c4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "675f2d45-4a21-44c7-a4fa-471e4f3763c4" (UID: "675f2d45-4a21-44c7-a4fa-471e4f3763c4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.719574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675f2d45-4a21-44c7-a4fa-471e4f3763c4-kube-api-access-bbxfm" (OuterVolumeSpecName: "kube-api-access-bbxfm") pod "675f2d45-4a21-44c7-a4fa-471e4f3763c4" (UID: "675f2d45-4a21-44c7-a4fa-471e4f3763c4"). InnerVolumeSpecName "kube-api-access-bbxfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.734180 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "675f2d45-4a21-44c7-a4fa-471e4f3763c4" (UID: "675f2d45-4a21-44c7-a4fa-471e4f3763c4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.738390 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "675f2d45-4a21-44c7-a4fa-471e4f3763c4" (UID: "675f2d45-4a21-44c7-a4fa-471e4f3763c4"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.745120 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-scripts" (OuterVolumeSpecName: "scripts") pod "675f2d45-4a21-44c7-a4fa-471e4f3763c4" (UID: "675f2d45-4a21-44c7-a4fa-471e4f3763c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.816230 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.816264 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbxfm\" (UniqueName: \"kubernetes.io/projected/675f2d45-4a21-44c7-a4fa-471e4f3763c4-kube-api-access-bbxfm\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.816278 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/675f2d45-4a21-44c7-a4fa-471e4f3763c4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.816289 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.816299 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/675f2d45-4a21-44c7-a4fa-471e4f3763c4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:33 crc kubenswrapper[4831]: I0309 16:39:33.816309 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/675f2d45-4a21-44c7-a4fa-471e4f3763c4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.381064 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff903c97107dbb7dd5b8e9938ed6bfcd23a52a31f29530b4e312c24750274b8" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.381137 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n24rx" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.800390 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl"] Mar 09 16:39:34 crc kubenswrapper[4831]: E0309 16:39:34.800730 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675f2d45-4a21-44c7-a4fa-471e4f3763c4" containerName="swift-ring-rebalance" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.800741 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="675f2d45-4a21-44c7-a4fa-471e4f3763c4" containerName="swift-ring-rebalance" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.800880 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="675f2d45-4a21-44c7-a4fa-471e4f3763c4" containerName="swift-ring-rebalance" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.801330 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.803500 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.804149 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.817532 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl"] Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.933501 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-swiftconf\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.933559 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-ring-data-devices\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.933587 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7v6c\" (UniqueName: \"kubernetes.io/projected/6d1e8e57-2397-4da2-ae53-d9174e057988-kube-api-access-n7v6c\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.933654 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-dispersionconf\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.933700 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-scripts\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:34 crc kubenswrapper[4831]: I0309 16:39:34.933728 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d1e8e57-2397-4da2-ae53-d9174e057988-etc-swift\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.035065 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-dispersionconf\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.035128 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-scripts\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 
16:39:35.035159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d1e8e57-2397-4da2-ae53-d9174e057988-etc-swift\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.035215 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-swiftconf\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.035248 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-ring-data-devices\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.035270 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7v6c\" (UniqueName: \"kubernetes.io/projected/6d1e8e57-2397-4da2-ae53-d9174e057988-kube-api-access-n7v6c\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.036329 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-scripts\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 
16:39:35.036498 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d1e8e57-2397-4da2-ae53-d9174e057988-etc-swift\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.036587 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-ring-data-devices\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.040487 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-swiftconf\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.041858 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-dispersionconf\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.061234 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7v6c\" (UniqueName: \"kubernetes.io/projected/6d1e8e57-2397-4da2-ae53-d9174e057988-kube-api-access-n7v6c\") pod \"swift-ring-rebalance-debug-6cbkl\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.131990 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.569658 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl"] Mar 09 16:39:35 crc kubenswrapper[4831]: W0309 16:39:35.574629 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d1e8e57_2397_4da2_ae53_d9174e057988.slice/crio-9eb34c319edf0877dc46727895f4ad6a6facef7245183ec09617446d50425339 WatchSource:0}: Error finding container 9eb34c319edf0877dc46727895f4ad6a6facef7245183ec09617446d50425339: Status 404 returned error can't find the container with id 9eb34c319edf0877dc46727895f4ad6a6facef7245183ec09617446d50425339 Mar 09 16:39:35 crc kubenswrapper[4831]: I0309 16:39:35.626313 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675f2d45-4a21-44c7-a4fa-471e4f3763c4" path="/var/lib/kubelet/pods/675f2d45-4a21-44c7-a4fa-471e4f3763c4/volumes" Mar 09 16:39:36 crc kubenswrapper[4831]: I0309 16:39:36.398135 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" event={"ID":"6d1e8e57-2397-4da2-ae53-d9174e057988","Type":"ContainerStarted","Data":"865c6fcd4f171ef7e17c208c7a45f30164ffd3f76d1f925507b6b2e83f3d19e1"} Mar 09 16:39:36 crc kubenswrapper[4831]: I0309 16:39:36.398186 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" event={"ID":"6d1e8e57-2397-4da2-ae53-d9174e057988","Type":"ContainerStarted","Data":"9eb34c319edf0877dc46727895f4ad6a6facef7245183ec09617446d50425339"} Mar 09 16:39:36 crc kubenswrapper[4831]: I0309 16:39:36.420153 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" podStartSLOduration=2.42013881 
podStartE2EDuration="2.42013881s" podCreationTimestamp="2026-03-09 16:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:36.414097739 +0000 UTC m=+2503.547780182" watchObservedRunningTime="2026-03-09 16:39:36.42013881 +0000 UTC m=+2503.553821233" Mar 09 16:39:36 crc kubenswrapper[4831]: I0309 16:39:36.617756 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:39:36 crc kubenswrapper[4831]: E0309 16:39:36.618002 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:39:37 crc kubenswrapper[4831]: I0309 16:39:37.408874 4831 generic.go:334] "Generic (PLEG): container finished" podID="6d1e8e57-2397-4da2-ae53-d9174e057988" containerID="865c6fcd4f171ef7e17c208c7a45f30164ffd3f76d1f925507b6b2e83f3d19e1" exitCode=0 Mar 09 16:39:37 crc kubenswrapper[4831]: I0309 16:39:37.409261 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" event={"ID":"6d1e8e57-2397-4da2-ae53-d9174e057988","Type":"ContainerDied","Data":"865c6fcd4f171ef7e17c208c7a45f30164ffd3f76d1f925507b6b2e83f3d19e1"} Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.728296 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.769257 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl"] Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.774928 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl"] Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.792215 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d1e8e57-2397-4da2-ae53-d9174e057988-etc-swift\") pod \"6d1e8e57-2397-4da2-ae53-d9174e057988\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.792315 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7v6c\" (UniqueName: \"kubernetes.io/projected/6d1e8e57-2397-4da2-ae53-d9174e057988-kube-api-access-n7v6c\") pod \"6d1e8e57-2397-4da2-ae53-d9174e057988\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.792343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-scripts\") pod \"6d1e8e57-2397-4da2-ae53-d9174e057988\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.792422 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-ring-data-devices\") pod \"6d1e8e57-2397-4da2-ae53-d9174e057988\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.792457 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-dispersionconf\") pod \"6d1e8e57-2397-4da2-ae53-d9174e057988\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.792492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-swiftconf\") pod \"6d1e8e57-2397-4da2-ae53-d9174e057988\" (UID: \"6d1e8e57-2397-4da2-ae53-d9174e057988\") " Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.793281 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6d1e8e57-2397-4da2-ae53-d9174e057988" (UID: "6d1e8e57-2397-4da2-ae53-d9174e057988"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.793448 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d1e8e57-2397-4da2-ae53-d9174e057988-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6d1e8e57-2397-4da2-ae53-d9174e057988" (UID: "6d1e8e57-2397-4da2-ae53-d9174e057988"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.797874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1e8e57-2397-4da2-ae53-d9174e057988-kube-api-access-n7v6c" (OuterVolumeSpecName: "kube-api-access-n7v6c") pod "6d1e8e57-2397-4da2-ae53-d9174e057988" (UID: "6d1e8e57-2397-4da2-ae53-d9174e057988"). InnerVolumeSpecName "kube-api-access-n7v6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.814067 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-scripts" (OuterVolumeSpecName: "scripts") pod "6d1e8e57-2397-4da2-ae53-d9174e057988" (UID: "6d1e8e57-2397-4da2-ae53-d9174e057988"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.816026 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6d1e8e57-2397-4da2-ae53-d9174e057988" (UID: "6d1e8e57-2397-4da2-ae53-d9174e057988"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.818978 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6d1e8e57-2397-4da2-ae53-d9174e057988" (UID: "6d1e8e57-2397-4da2-ae53-d9174e057988"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.894165 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.894383 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d1e8e57-2397-4da2-ae53-d9174e057988-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.894470 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7v6c\" (UniqueName: \"kubernetes.io/projected/6d1e8e57-2397-4da2-ae53-d9174e057988-kube-api-access-n7v6c\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.894554 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.894609 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d1e8e57-2397-4da2-ae53-d9174e057988-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:38 crc kubenswrapper[4831]: I0309 16:39:38.894666 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d1e8e57-2397-4da2-ae53-d9174e057988-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.425470 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb34c319edf0877dc46727895f4ad6a6facef7245183ec09617446d50425339" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.425533 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6cbkl" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.625715 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d1e8e57-2397-4da2-ae53-d9174e057988" path="/var/lib/kubelet/pods/6d1e8e57-2397-4da2-ae53-d9174e057988/volumes" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.897082 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn"] Mar 09 16:39:39 crc kubenswrapper[4831]: E0309 16:39:39.897388 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1e8e57-2397-4da2-ae53-d9174e057988" containerName="swift-ring-rebalance" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.897422 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1e8e57-2397-4da2-ae53-d9174e057988" containerName="swift-ring-rebalance" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.897553 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1e8e57-2397-4da2-ae53-d9174e057988" containerName="swift-ring-rebalance" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.897982 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.901482 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.902282 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:39 crc kubenswrapper[4831]: I0309 16:39:39.929512 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn"] Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.013639 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-ring-data-devices\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.013703 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-scripts\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.013746 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r98lr\" (UniqueName: \"kubernetes.io/projected/7ce2c71b-68f7-4aec-92bc-7fa879c28057-kube-api-access-r98lr\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.013769 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-dispersionconf\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.013813 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-swiftconf\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.013826 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7ce2c71b-68f7-4aec-92bc-7fa879c28057-etc-swift\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.115578 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-ring-data-devices\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.115687 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-scripts\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: 
I0309 16:39:40.115746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r98lr\" (UniqueName: \"kubernetes.io/projected/7ce2c71b-68f7-4aec-92bc-7fa879c28057-kube-api-access-r98lr\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.115785 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-dispersionconf\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.115847 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-swiftconf\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.115868 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7ce2c71b-68f7-4aec-92bc-7fa879c28057-etc-swift\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.116541 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7ce2c71b-68f7-4aec-92bc-7fa879c28057-etc-swift\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: 
I0309 16:39:40.116673 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-scripts\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.117174 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-ring-data-devices\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.119420 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-swiftconf\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.119845 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-dispersionconf\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.136063 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r98lr\" (UniqueName: \"kubernetes.io/projected/7ce2c71b-68f7-4aec-92bc-7fa879c28057-kube-api-access-r98lr\") pod \"swift-ring-rebalance-debug-xhmbn\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.214751 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:40 crc kubenswrapper[4831]: I0309 16:39:40.693708 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn"] Mar 09 16:39:41 crc kubenswrapper[4831]: I0309 16:39:41.447515 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" event={"ID":"7ce2c71b-68f7-4aec-92bc-7fa879c28057","Type":"ContainerStarted","Data":"b1d3a40cdf7488df7cef2675cbc40c911ec3aba1dec209921244b93184350dbc"} Mar 09 16:39:41 crc kubenswrapper[4831]: I0309 16:39:41.447584 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" event={"ID":"7ce2c71b-68f7-4aec-92bc-7fa879c28057","Type":"ContainerStarted","Data":"fba687998af9ac8b0582991b7d1821994702716f53dec72352186929eaf3eb9b"} Mar 09 16:39:41 crc kubenswrapper[4831]: I0309 16:39:41.468599 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" podStartSLOduration=2.4685759689999998 podStartE2EDuration="2.468575969s" podCreationTimestamp="2026-03-09 16:39:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:41.465653096 +0000 UTC m=+2508.599335549" watchObservedRunningTime="2026-03-09 16:39:41.468575969 +0000 UTC m=+2508.602258412" Mar 09 16:39:42 crc kubenswrapper[4831]: I0309 16:39:42.458450 4831 generic.go:334] "Generic (PLEG): container finished" podID="7ce2c71b-68f7-4aec-92bc-7fa879c28057" containerID="b1d3a40cdf7488df7cef2675cbc40c911ec3aba1dec209921244b93184350dbc" exitCode=0 Mar 09 16:39:42 crc kubenswrapper[4831]: I0309 16:39:42.458734 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" 
event={"ID":"7ce2c71b-68f7-4aec-92bc-7fa879c28057","Type":"ContainerDied","Data":"b1d3a40cdf7488df7cef2675cbc40c911ec3aba1dec209921244b93184350dbc"} Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.735348 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.767082 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn"] Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.774921 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn"] Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.898803 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-swiftconf\") pod \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.898903 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-scripts\") pod \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.898980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r98lr\" (UniqueName: \"kubernetes.io/projected/7ce2c71b-68f7-4aec-92bc-7fa879c28057-kube-api-access-r98lr\") pod \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.899029 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-dispersionconf\") pod \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.899055 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-ring-data-devices\") pod \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.899121 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7ce2c71b-68f7-4aec-92bc-7fa879c28057-etc-swift\") pod \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\" (UID: \"7ce2c71b-68f7-4aec-92bc-7fa879c28057\") " Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.900559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7ce2c71b-68f7-4aec-92bc-7fa879c28057" (UID: "7ce2c71b-68f7-4aec-92bc-7fa879c28057"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.900729 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ce2c71b-68f7-4aec-92bc-7fa879c28057-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7ce2c71b-68f7-4aec-92bc-7fa879c28057" (UID: "7ce2c71b-68f7-4aec-92bc-7fa879c28057"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.908566 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce2c71b-68f7-4aec-92bc-7fa879c28057-kube-api-access-r98lr" (OuterVolumeSpecName: "kube-api-access-r98lr") pod "7ce2c71b-68f7-4aec-92bc-7fa879c28057" (UID: "7ce2c71b-68f7-4aec-92bc-7fa879c28057"). InnerVolumeSpecName "kube-api-access-r98lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.923565 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7ce2c71b-68f7-4aec-92bc-7fa879c28057" (UID: "7ce2c71b-68f7-4aec-92bc-7fa879c28057"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.923782 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7ce2c71b-68f7-4aec-92bc-7fa879c28057" (UID: "7ce2c71b-68f7-4aec-92bc-7fa879c28057"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:43 crc kubenswrapper[4831]: I0309 16:39:43.934917 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-scripts" (OuterVolumeSpecName: "scripts") pod "7ce2c71b-68f7-4aec-92bc-7fa879c28057" (UID: "7ce2c71b-68f7-4aec-92bc-7fa879c28057"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.000782 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.000816 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.000831 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r98lr\" (UniqueName: \"kubernetes.io/projected/7ce2c71b-68f7-4aec-92bc-7fa879c28057-kube-api-access-r98lr\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.000843 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7ce2c71b-68f7-4aec-92bc-7fa879c28057-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.000853 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7ce2c71b-68f7-4aec-92bc-7fa879c28057-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.000862 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7ce2c71b-68f7-4aec-92bc-7fa879c28057-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.479607 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fba687998af9ac8b0582991b7d1821994702716f53dec72352186929eaf3eb9b" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.479676 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhmbn" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.937542 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w"] Mar 09 16:39:44 crc kubenswrapper[4831]: E0309 16:39:44.938808 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce2c71b-68f7-4aec-92bc-7fa879c28057" containerName="swift-ring-rebalance" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.938911 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce2c71b-68f7-4aec-92bc-7fa879c28057" containerName="swift-ring-rebalance" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.939109 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce2c71b-68f7-4aec-92bc-7fa879c28057" containerName="swift-ring-rebalance" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.939688 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.941704 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.945530 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:44 crc kubenswrapper[4831]: I0309 16:39:44.949137 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w"] Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.024588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52mx\" (UniqueName: \"kubernetes.io/projected/581d4150-74ee-4553-b1a2-1a8ae84441df-kube-api-access-d52mx\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.024679 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-dispersionconf\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.024720 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/581d4150-74ee-4553-b1a2-1a8ae84441df-etc-swift\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.024858 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-scripts\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.024954 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-swiftconf\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.025024 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-ring-data-devices\") pod 
\"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.126069 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/581d4150-74ee-4553-b1a2-1a8ae84441df-etc-swift\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.126127 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-scripts\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.126157 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-swiftconf\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.126177 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-ring-data-devices\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.126232 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52mx\" (UniqueName: 
\"kubernetes.io/projected/581d4150-74ee-4553-b1a2-1a8ae84441df-kube-api-access-d52mx\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.126278 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-dispersionconf\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.126552 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/581d4150-74ee-4553-b1a2-1a8ae84441df-etc-swift\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.127107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-scripts\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.127173 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-ring-data-devices\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.130107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-swiftconf\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.130693 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-dispersionconf\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.144199 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52mx\" (UniqueName: \"kubernetes.io/projected/581d4150-74ee-4553-b1a2-1a8ae84441df-kube-api-access-d52mx\") pod \"swift-ring-rebalance-debug-5vt7w\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.261920 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.515086 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w"] Mar 09 16:39:45 crc kubenswrapper[4831]: W0309 16:39:45.525698 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod581d4150_74ee_4553_b1a2_1a8ae84441df.slice/crio-d51c9a1343ec9cae8301d062695e0dc760cafed37b99d32e55cd679ffdd289a8 WatchSource:0}: Error finding container d51c9a1343ec9cae8301d062695e0dc760cafed37b99d32e55cd679ffdd289a8: Status 404 returned error can't find the container with id d51c9a1343ec9cae8301d062695e0dc760cafed37b99d32e55cd679ffdd289a8 Mar 09 16:39:45 crc kubenswrapper[4831]: I0309 16:39:45.626522 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce2c71b-68f7-4aec-92bc-7fa879c28057" path="/var/lib/kubelet/pods/7ce2c71b-68f7-4aec-92bc-7fa879c28057/volumes" Mar 09 16:39:46 crc kubenswrapper[4831]: I0309 16:39:46.506297 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" event={"ID":"581d4150-74ee-4553-b1a2-1a8ae84441df","Type":"ContainerStarted","Data":"284632c86a0d7459fdc9909e8f4a9046e6f39a0d6ebd4666495255014b0f875a"} Mar 09 16:39:46 crc kubenswrapper[4831]: I0309 16:39:46.506632 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" event={"ID":"581d4150-74ee-4553-b1a2-1a8ae84441df","Type":"ContainerStarted","Data":"d51c9a1343ec9cae8301d062695e0dc760cafed37b99d32e55cd679ffdd289a8"} Mar 09 16:39:46 crc kubenswrapper[4831]: I0309 16:39:46.525610 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" podStartSLOduration=2.52559363 podStartE2EDuration="2.52559363s" podCreationTimestamp="2026-03-09 
16:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:46.522370178 +0000 UTC m=+2513.656052601" watchObservedRunningTime="2026-03-09 16:39:46.52559363 +0000 UTC m=+2513.659276043" Mar 09 16:39:47 crc kubenswrapper[4831]: I0309 16:39:47.517425 4831 generic.go:334] "Generic (PLEG): container finished" podID="581d4150-74ee-4553-b1a2-1a8ae84441df" containerID="284632c86a0d7459fdc9909e8f4a9046e6f39a0d6ebd4666495255014b0f875a" exitCode=0 Mar 09 16:39:47 crc kubenswrapper[4831]: I0309 16:39:47.517517 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" event={"ID":"581d4150-74ee-4553-b1a2-1a8ae84441df","Type":"ContainerDied","Data":"284632c86a0d7459fdc9909e8f4a9046e6f39a0d6ebd4666495255014b0f875a"} Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.774216 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.805255 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w"] Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.810018 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w"] Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.883372 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/581d4150-74ee-4553-b1a2-1a8ae84441df-etc-swift\") pod \"581d4150-74ee-4553-b1a2-1a8ae84441df\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.883728 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d52mx\" (UniqueName: 
\"kubernetes.io/projected/581d4150-74ee-4553-b1a2-1a8ae84441df-kube-api-access-d52mx\") pod \"581d4150-74ee-4553-b1a2-1a8ae84441df\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.884238 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/581d4150-74ee-4553-b1a2-1a8ae84441df-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "581d4150-74ee-4553-b1a2-1a8ae84441df" (UID: "581d4150-74ee-4553-b1a2-1a8ae84441df"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.884754 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-scripts\") pod \"581d4150-74ee-4553-b1a2-1a8ae84441df\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.884897 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-dispersionconf\") pod \"581d4150-74ee-4553-b1a2-1a8ae84441df\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.884972 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-ring-data-devices\") pod \"581d4150-74ee-4553-b1a2-1a8ae84441df\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") " Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.885052 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-swiftconf\") pod \"581d4150-74ee-4553-b1a2-1a8ae84441df\" (UID: \"581d4150-74ee-4553-b1a2-1a8ae84441df\") 
" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.885523 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "581d4150-74ee-4553-b1a2-1a8ae84441df" (UID: "581d4150-74ee-4553-b1a2-1a8ae84441df"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.885675 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.885689 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/581d4150-74ee-4553-b1a2-1a8ae84441df-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.890993 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581d4150-74ee-4553-b1a2-1a8ae84441df-kube-api-access-d52mx" (OuterVolumeSpecName: "kube-api-access-d52mx") pod "581d4150-74ee-4553-b1a2-1a8ae84441df" (UID: "581d4150-74ee-4553-b1a2-1a8ae84441df"). InnerVolumeSpecName "kube-api-access-d52mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.908820 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "581d4150-74ee-4553-b1a2-1a8ae84441df" (UID: "581d4150-74ee-4553-b1a2-1a8ae84441df"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.910114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "581d4150-74ee-4553-b1a2-1a8ae84441df" (UID: "581d4150-74ee-4553-b1a2-1a8ae84441df"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.912083 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-scripts" (OuterVolumeSpecName: "scripts") pod "581d4150-74ee-4553-b1a2-1a8ae84441df" (UID: "581d4150-74ee-4553-b1a2-1a8ae84441df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.986958 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.987003 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d52mx\" (UniqueName: \"kubernetes.io/projected/581d4150-74ee-4553-b1a2-1a8ae84441df-kube-api-access-d52mx\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.987016 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/581d4150-74ee-4553-b1a2-1a8ae84441df-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:48 crc kubenswrapper[4831]: I0309 16:39:48.987025 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/581d4150-74ee-4553-b1a2-1a8ae84441df-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 
16:39:49.538865 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d51c9a1343ec9cae8301d062695e0dc760cafed37b99d32e55cd679ffdd289a8" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.538900 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5vt7w" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.628654 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581d4150-74ee-4553-b1a2-1a8ae84441df" path="/var/lib/kubelet/pods/581d4150-74ee-4553-b1a2-1a8ae84441df/volumes" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.958194 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v82rs"] Mar 09 16:39:49 crc kubenswrapper[4831]: E0309 16:39:49.958622 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581d4150-74ee-4553-b1a2-1a8ae84441df" containerName="swift-ring-rebalance" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.958639 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="581d4150-74ee-4553-b1a2-1a8ae84441df" containerName="swift-ring-rebalance" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.958830 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="581d4150-74ee-4553-b1a2-1a8ae84441df" containerName="swift-ring-rebalance" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.959487 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.961260 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.962278 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:49 crc kubenswrapper[4831]: I0309 16:39:49.968521 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v82rs"] Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.105434 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dae97b09-708a-4649-861f-cafc163859c9-etc-swift\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.105519 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-dispersionconf\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.105631 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-scripts\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.105694 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-swiftconf\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.105734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2n9\" (UniqueName: \"kubernetes.io/projected/dae97b09-708a-4649-861f-cafc163859c9-kube-api-access-ql2n9\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.105760 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-ring-data-devices\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.207574 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dae97b09-708a-4649-861f-cafc163859c9-etc-swift\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.207951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-dispersionconf\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.208048 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dae97b09-708a-4649-861f-cafc163859c9-etc-swift\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.208184 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-scripts\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.208317 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-swiftconf\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.208463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2n9\" (UniqueName: \"kubernetes.io/projected/dae97b09-708a-4649-861f-cafc163859c9-kube-api-access-ql2n9\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.208575 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-ring-data-devices\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.208869 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-scripts\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.209305 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-ring-data-devices\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.212649 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-dispersionconf\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.212695 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-swiftconf\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.236037 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2n9\" (UniqueName: \"kubernetes.io/projected/dae97b09-708a-4649-861f-cafc163859c9-kube-api-access-ql2n9\") pod \"swift-ring-rebalance-debug-v82rs\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.277341 4831 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.462018 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v82rs"] Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.549376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" event={"ID":"dae97b09-708a-4649-861f-cafc163859c9","Type":"ContainerStarted","Data":"0effe2b1a360dd8c53bb9158de8083fcac397d91d97ea7f76066a2b5d40587e0"} Mar 09 16:39:50 crc kubenswrapper[4831]: I0309 16:39:50.617358 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:39:50 crc kubenswrapper[4831]: E0309 16:39:50.618026 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:39:51 crc kubenswrapper[4831]: I0309 16:39:51.558054 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" event={"ID":"dae97b09-708a-4649-861f-cafc163859c9","Type":"ContainerStarted","Data":"e01b8603f7f1d8069c7f35994516468940ecd043607350a51bdd5e83e6ecd534"} Mar 09 16:39:51 crc kubenswrapper[4831]: I0309 16:39:51.576007 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" podStartSLOduration=2.5759883439999998 podStartE2EDuration="2.575988344s" podCreationTimestamp="2026-03-09 16:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:51.571323132 +0000 UTC m=+2518.705005565" watchObservedRunningTime="2026-03-09 16:39:51.575988344 +0000 UTC m=+2518.709670767" Mar 09 16:39:52 crc kubenswrapper[4831]: I0309 16:39:52.585525 4831 generic.go:334] "Generic (PLEG): container finished" podID="dae97b09-708a-4649-861f-cafc163859c9" containerID="e01b8603f7f1d8069c7f35994516468940ecd043607350a51bdd5e83e6ecd534" exitCode=0 Mar 09 16:39:52 crc kubenswrapper[4831]: I0309 16:39:52.585616 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" event={"ID":"dae97b09-708a-4649-861f-cafc163859c9","Type":"ContainerDied","Data":"e01b8603f7f1d8069c7f35994516468940ecd043607350a51bdd5e83e6ecd534"} Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.874845 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.913332 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v82rs"] Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.922713 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v82rs"] Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.968646 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-dispersionconf\") pod \"dae97b09-708a-4649-861f-cafc163859c9\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.968741 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-scripts\") pod \"dae97b09-708a-4649-861f-cafc163859c9\" (UID: 
\"dae97b09-708a-4649-861f-cafc163859c9\") " Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.968834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-swiftconf\") pod \"dae97b09-708a-4649-861f-cafc163859c9\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.968880 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql2n9\" (UniqueName: \"kubernetes.io/projected/dae97b09-708a-4649-861f-cafc163859c9-kube-api-access-ql2n9\") pod \"dae97b09-708a-4649-861f-cafc163859c9\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.968927 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-ring-data-devices\") pod \"dae97b09-708a-4649-861f-cafc163859c9\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.968976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dae97b09-708a-4649-861f-cafc163859c9-etc-swift\") pod \"dae97b09-708a-4649-861f-cafc163859c9\" (UID: \"dae97b09-708a-4649-861f-cafc163859c9\") " Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.969692 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dae97b09-708a-4649-861f-cafc163859c9" (UID: "dae97b09-708a-4649-861f-cafc163859c9"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.969813 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae97b09-708a-4649-861f-cafc163859c9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dae97b09-708a-4649-861f-cafc163859c9" (UID: "dae97b09-708a-4649-861f-cafc163859c9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.980144 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae97b09-708a-4649-861f-cafc163859c9-kube-api-access-ql2n9" (OuterVolumeSpecName: "kube-api-access-ql2n9") pod "dae97b09-708a-4649-861f-cafc163859c9" (UID: "dae97b09-708a-4649-861f-cafc163859c9"). InnerVolumeSpecName "kube-api-access-ql2n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.989000 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-scripts" (OuterVolumeSpecName: "scripts") pod "dae97b09-708a-4649-861f-cafc163859c9" (UID: "dae97b09-708a-4649-861f-cafc163859c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:53 crc kubenswrapper[4831]: I0309 16:39:53.990902 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dae97b09-708a-4649-861f-cafc163859c9" (UID: "dae97b09-708a-4649-861f-cafc163859c9"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.000545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dae97b09-708a-4649-861f-cafc163859c9" (UID: "dae97b09-708a-4649-861f-cafc163859c9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.070635 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.070674 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.070685 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql2n9\" (UniqueName: \"kubernetes.io/projected/dae97b09-708a-4649-861f-cafc163859c9-kube-api-access-ql2n9\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.070696 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dae97b09-708a-4649-861f-cafc163859c9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.070705 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dae97b09-708a-4649-861f-cafc163859c9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.070713 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/dae97b09-708a-4649-861f-cafc163859c9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.608703 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0effe2b1a360dd8c53bb9158de8083fcac397d91d97ea7f76066a2b5d40587e0" Mar 09 16:39:54 crc kubenswrapper[4831]: I0309 16:39:54.608814 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v82rs" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.071132 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz"] Mar 09 16:39:55 crc kubenswrapper[4831]: E0309 16:39:55.071729 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae97b09-708a-4649-861f-cafc163859c9" containerName="swift-ring-rebalance" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.071743 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae97b09-708a-4649-861f-cafc163859c9" containerName="swift-ring-rebalance" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.071886 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae97b09-708a-4649-861f-cafc163859c9" containerName="swift-ring-rebalance" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.072333 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.074343 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.074630 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.085221 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz"] Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.186848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-swiftconf\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.186960 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5ksd\" (UniqueName: \"kubernetes.io/projected/e7f5af38-731d-4da0-9018-bdf866f51624-kube-api-access-b5ksd\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.187002 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.187045 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7f5af38-731d-4da0-9018-bdf866f51624-etc-swift\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.187070 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-dispersionconf\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.187338 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-scripts\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.289237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-dispersionconf\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.289385 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-scripts\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 
16:39:55.289452 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-swiftconf\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.289490 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5ksd\" (UniqueName: \"kubernetes.io/projected/e7f5af38-731d-4da0-9018-bdf866f51624-kube-api-access-b5ksd\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.289525 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.289564 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7f5af38-731d-4da0-9018-bdf866f51624-etc-swift\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.290203 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7f5af38-731d-4da0-9018-bdf866f51624-etc-swift\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: 
I0309 16:39:55.290775 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-scripts\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.290858 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.303680 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-dispersionconf\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.304898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-swiftconf\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.308716 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5ksd\" (UniqueName: \"kubernetes.io/projected/e7f5af38-731d-4da0-9018-bdf866f51624-kube-api-access-b5ksd\") pod \"swift-ring-rebalance-debug-v8hxz\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.389879 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.626307 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae97b09-708a-4649-861f-cafc163859c9" path="/var/lib/kubelet/pods/dae97b09-708a-4649-861f-cafc163859c9/volumes" Mar 09 16:39:55 crc kubenswrapper[4831]: I0309 16:39:55.829037 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz"] Mar 09 16:39:56 crc kubenswrapper[4831]: I0309 16:39:56.631903 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" event={"ID":"e7f5af38-731d-4da0-9018-bdf866f51624","Type":"ContainerStarted","Data":"9e0b9594ea256e33c76bee00bf5c7bf1f4a65a12081795538757fb2e2fdd2433"} Mar 09 16:39:56 crc kubenswrapper[4831]: I0309 16:39:56.632320 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" event={"ID":"e7f5af38-731d-4da0-9018-bdf866f51624","Type":"ContainerStarted","Data":"c5e81eb2769374e9eba30a56d79eef745f1530d0c6e1bc9e6205c8aafb7d7ae1"} Mar 09 16:39:56 crc kubenswrapper[4831]: I0309 16:39:56.658937 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" podStartSLOduration=1.658911631 podStartE2EDuration="1.658911631s" podCreationTimestamp="2026-03-09 16:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:39:56.648041501 +0000 UTC m=+2523.781723924" watchObservedRunningTime="2026-03-09 16:39:56.658911631 +0000 UTC m=+2523.792594064" Mar 09 16:39:57 crc kubenswrapper[4831]: I0309 16:39:57.642457 4831 generic.go:334] "Generic (PLEG): container finished" podID="e7f5af38-731d-4da0-9018-bdf866f51624" 
containerID="9e0b9594ea256e33c76bee00bf5c7bf1f4a65a12081795538757fb2e2fdd2433" exitCode=0 Mar 09 16:39:57 crc kubenswrapper[4831]: I0309 16:39:57.642565 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" event={"ID":"e7f5af38-731d-4da0-9018-bdf866f51624","Type":"ContainerDied","Data":"9e0b9594ea256e33c76bee00bf5c7bf1f4a65a12081795538757fb2e2fdd2433"} Mar 09 16:39:58 crc kubenswrapper[4831]: I0309 16:39:58.915061 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:39:58 crc kubenswrapper[4831]: I0309 16:39:58.942822 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz"] Mar 09 16:39:58 crc kubenswrapper[4831]: I0309 16:39:58.948490 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz"] Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.045999 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5ksd\" (UniqueName: \"kubernetes.io/projected/e7f5af38-731d-4da0-9018-bdf866f51624-kube-api-access-b5ksd\") pod \"e7f5af38-731d-4da0-9018-bdf866f51624\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.046341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-dispersionconf\") pod \"e7f5af38-731d-4da0-9018-bdf866f51624\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.046509 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-swiftconf\") pod \"e7f5af38-731d-4da0-9018-bdf866f51624\" (UID: 
\"e7f5af38-731d-4da0-9018-bdf866f51624\") " Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.046657 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-ring-data-devices\") pod \"e7f5af38-731d-4da0-9018-bdf866f51624\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.046819 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-scripts\") pod \"e7f5af38-731d-4da0-9018-bdf866f51624\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.046971 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7f5af38-731d-4da0-9018-bdf866f51624-etc-swift\") pod \"e7f5af38-731d-4da0-9018-bdf866f51624\" (UID: \"e7f5af38-731d-4da0-9018-bdf866f51624\") " Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.047222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e7f5af38-731d-4da0-9018-bdf866f51624" (UID: "e7f5af38-731d-4da0-9018-bdf866f51624"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.047579 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.047687 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7f5af38-731d-4da0-9018-bdf866f51624-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e7f5af38-731d-4da0-9018-bdf866f51624" (UID: "e7f5af38-731d-4da0-9018-bdf866f51624"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.058671 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f5af38-731d-4da0-9018-bdf866f51624-kube-api-access-b5ksd" (OuterVolumeSpecName: "kube-api-access-b5ksd") pod "e7f5af38-731d-4da0-9018-bdf866f51624" (UID: "e7f5af38-731d-4da0-9018-bdf866f51624"). InnerVolumeSpecName "kube-api-access-b5ksd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.069208 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-scripts" (OuterVolumeSpecName: "scripts") pod "e7f5af38-731d-4da0-9018-bdf866f51624" (UID: "e7f5af38-731d-4da0-9018-bdf866f51624"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.071309 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e7f5af38-731d-4da0-9018-bdf866f51624" (UID: "e7f5af38-731d-4da0-9018-bdf866f51624"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.072670 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e7f5af38-731d-4da0-9018-bdf866f51624" (UID: "e7f5af38-731d-4da0-9018-bdf866f51624"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.149362 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7f5af38-731d-4da0-9018-bdf866f51624-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.149422 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7f5af38-731d-4da0-9018-bdf866f51624-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.149436 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5ksd\" (UniqueName: \"kubernetes.io/projected/e7f5af38-731d-4da0-9018-bdf866f51624-kube-api-access-b5ksd\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.149448 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.149462 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7f5af38-731d-4da0-9018-bdf866f51624-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.627913 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7f5af38-731d-4da0-9018-bdf866f51624" path="/var/lib/kubelet/pods/e7f5af38-731d-4da0-9018-bdf866f51624/volumes" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.687318 4831 scope.go:117] "RemoveContainer" containerID="9e0b9594ea256e33c76bee00bf5c7bf1f4a65a12081795538757fb2e2fdd2433" Mar 09 16:39:59 crc kubenswrapper[4831]: I0309 16:39:59.687581 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hxz" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.096043 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c68vc"] Mar 09 16:40:00 crc kubenswrapper[4831]: E0309 16:40:00.096447 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f5af38-731d-4da0-9018-bdf866f51624" containerName="swift-ring-rebalance" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.096458 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f5af38-731d-4da0-9018-bdf866f51624" containerName="swift-ring-rebalance" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.096639 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f5af38-731d-4da0-9018-bdf866f51624" containerName="swift-ring-rebalance" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.097205 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.099956 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.101076 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.102437 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c68vc"] Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.144003 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551240-s2zfz"] Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.145187 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.147380 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.148575 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.148770 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.163271 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551240-s2zfz"] Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.170545 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-swiftconf\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: 
\"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.170608 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59znd\" (UniqueName: \"kubernetes.io/projected/15fc6658-ad64-4026-924b-b6955cc24a5a-kube-api-access-59znd\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.170664 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-scripts\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.170724 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-dispersionconf\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.170755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15fc6658-ad64-4026-924b-b6955cc24a5a-etc-swift\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.170774 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-ring-data-devices\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.272992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtclp\" (UniqueName: \"kubernetes.io/projected/8c0ffa7f-eeb8-489b-a60a-b04b1d731453-kube-api-access-mtclp\") pod \"auto-csr-approver-29551240-s2zfz\" (UID: \"8c0ffa7f-eeb8-489b-a60a-b04b1d731453\") " pod="openshift-infra/auto-csr-approver-29551240-s2zfz" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.273069 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15fc6658-ad64-4026-924b-b6955cc24a5a-etc-swift\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.273097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-ring-data-devices\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.273216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-swiftconf\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.273239 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-59znd\" (UniqueName: \"kubernetes.io/projected/15fc6658-ad64-4026-924b-b6955cc24a5a-kube-api-access-59znd\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.273269 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-scripts\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.273312 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-dispersionconf\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.273641 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15fc6658-ad64-4026-924b-b6955cc24a5a-etc-swift\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.274280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-scripts\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.274280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-ring-data-devices\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.278456 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-dispersionconf\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.288071 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-swiftconf\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.291885 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59znd\" (UniqueName: \"kubernetes.io/projected/15fc6658-ad64-4026-924b-b6955cc24a5a-kube-api-access-59znd\") pod \"swift-ring-rebalance-debug-c68vc\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.374532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtclp\" (UniqueName: \"kubernetes.io/projected/8c0ffa7f-eeb8-489b-a60a-b04b1d731453-kube-api-access-mtclp\") pod \"auto-csr-approver-29551240-s2zfz\" (UID: \"8c0ffa7f-eeb8-489b-a60a-b04b1d731453\") " pod="openshift-infra/auto-csr-approver-29551240-s2zfz" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.390927 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-mtclp\" (UniqueName: \"kubernetes.io/projected/8c0ffa7f-eeb8-489b-a60a-b04b1d731453-kube-api-access-mtclp\") pod \"auto-csr-approver-29551240-s2zfz\" (UID: \"8c0ffa7f-eeb8-489b-a60a-b04b1d731453\") " pod="openshift-infra/auto-csr-approver-29551240-s2zfz" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.414365 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.462445 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.862512 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c68vc"] Mar 09 16:40:00 crc kubenswrapper[4831]: W0309 16:40:00.868671 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15fc6658_ad64_4026_924b_b6955cc24a5a.slice/crio-a367ec41cc6b7260bdcca25d86b0e0bbd8b5c09f349dd8d555106528362b2193 WatchSource:0}: Error finding container a367ec41cc6b7260bdcca25d86b0e0bbd8b5c09f349dd8d555106528362b2193: Status 404 returned error can't find the container with id a367ec41cc6b7260bdcca25d86b0e0bbd8b5c09f349dd8d555106528362b2193 Mar 09 16:40:00 crc kubenswrapper[4831]: I0309 16:40:00.930135 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551240-s2zfz"] Mar 09 16:40:00 crc kubenswrapper[4831]: W0309 16:40:00.938892 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c0ffa7f_eeb8_489b_a60a_b04b1d731453.slice/crio-33814a0e685066c7272c830e19fc5f10984fc503c2f8e099f1afbdbfe911c9f2 WatchSource:0}: Error finding container 33814a0e685066c7272c830e19fc5f10984fc503c2f8e099f1afbdbfe911c9f2: Status 404 
returned error can't find the container with id 33814a0e685066c7272c830e19fc5f10984fc503c2f8e099f1afbdbfe911c9f2 Mar 09 16:40:01 crc kubenswrapper[4831]: I0309 16:40:01.618264 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:40:01 crc kubenswrapper[4831]: E0309 16:40:01.618765 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:40:01 crc kubenswrapper[4831]: I0309 16:40:01.708195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" event={"ID":"8c0ffa7f-eeb8-489b-a60a-b04b1d731453","Type":"ContainerStarted","Data":"33814a0e685066c7272c830e19fc5f10984fc503c2f8e099f1afbdbfe911c9f2"} Mar 09 16:40:01 crc kubenswrapper[4831]: I0309 16:40:01.710282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" event={"ID":"15fc6658-ad64-4026-924b-b6955cc24a5a","Type":"ContainerStarted","Data":"f8f97e2dc9aab56a1f40274d59cd4bcb1e0c2300fb978718031debf4e5c9980f"} Mar 09 16:40:01 crc kubenswrapper[4831]: I0309 16:40:01.710312 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" event={"ID":"15fc6658-ad64-4026-924b-b6955cc24a5a","Type":"ContainerStarted","Data":"a367ec41cc6b7260bdcca25d86b0e0bbd8b5c09f349dd8d555106528362b2193"} Mar 09 16:40:01 crc kubenswrapper[4831]: I0309 16:40:01.730893 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" podStartSLOduration=1.7308698599999999 
podStartE2EDuration="1.73086986s" podCreationTimestamp="2026-03-09 16:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:01.724482828 +0000 UTC m=+2528.858165271" watchObservedRunningTime="2026-03-09 16:40:01.73086986 +0000 UTC m=+2528.864552323" Mar 09 16:40:02 crc kubenswrapper[4831]: I0309 16:40:02.720187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" event={"ID":"8c0ffa7f-eeb8-489b-a60a-b04b1d731453","Type":"ContainerStarted","Data":"882fa21516b5ae75d24ca2b43c779677678200693f5ed0cad8396f1062c7d08e"} Mar 09 16:40:02 crc kubenswrapper[4831]: I0309 16:40:02.721938 4831 generic.go:334] "Generic (PLEG): container finished" podID="15fc6658-ad64-4026-924b-b6955cc24a5a" containerID="f8f97e2dc9aab56a1f40274d59cd4bcb1e0c2300fb978718031debf4e5c9980f" exitCode=0 Mar 09 16:40:02 crc kubenswrapper[4831]: I0309 16:40:02.721964 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" event={"ID":"15fc6658-ad64-4026-924b-b6955cc24a5a","Type":"ContainerDied","Data":"f8f97e2dc9aab56a1f40274d59cd4bcb1e0c2300fb978718031debf4e5c9980f"} Mar 09 16:40:02 crc kubenswrapper[4831]: I0309 16:40:02.734539 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" podStartSLOduration=1.368896791 podStartE2EDuration="2.734519262s" podCreationTimestamp="2026-03-09 16:40:00 +0000 UTC" firstStartedPulling="2026-03-09 16:40:00.94125608 +0000 UTC m=+2528.074938493" lastFinishedPulling="2026-03-09 16:40:02.306878541 +0000 UTC m=+2529.440560964" observedRunningTime="2026-03-09 16:40:02.732109154 +0000 UTC m=+2529.865791577" watchObservedRunningTime="2026-03-09 16:40:02.734519262 +0000 UTC m=+2529.868201685" Mar 09 16:40:03 crc kubenswrapper[4831]: I0309 16:40:03.737226 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="8c0ffa7f-eeb8-489b-a60a-b04b1d731453" containerID="882fa21516b5ae75d24ca2b43c779677678200693f5ed0cad8396f1062c7d08e" exitCode=0 Mar 09 16:40:03 crc kubenswrapper[4831]: I0309 16:40:03.737275 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" event={"ID":"8c0ffa7f-eeb8-489b-a60a-b04b1d731453","Type":"ContainerDied","Data":"882fa21516b5ae75d24ca2b43c779677678200693f5ed0cad8396f1062c7d08e"} Mar 09 16:40:03 crc kubenswrapper[4831]: I0309 16:40:03.998638 4831 scope.go:117] "RemoveContainer" containerID="7c6ad37776926321a3ce461804570fde2c62a843b8f2efd6e2b64e17ea02e6ac" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.053129 4831 scope.go:117] "RemoveContainer" containerID="b378cbfeaecd1a93f2fd472dc8240e300d82616d13167ed7330edf8618c2dcd7" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.065518 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.093066 4831 scope.go:117] "RemoveContainer" containerID="dc5bb4e00be4dad591defb87d2e267fae0a485190618ba0f4303104824291901" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.102569 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c68vc"] Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.109391 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c68vc"] Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.124982 4831 scope.go:117] "RemoveContainer" containerID="6961fb2f08bddb437a7f685cd7a99515f7d506f8eb6b52b66c2c737f3183c7dd" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.129795 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59znd\" (UniqueName: 
\"kubernetes.io/projected/15fc6658-ad64-4026-924b-b6955cc24a5a-kube-api-access-59znd\") pod \"15fc6658-ad64-4026-924b-b6955cc24a5a\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.129903 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-ring-data-devices\") pod \"15fc6658-ad64-4026-924b-b6955cc24a5a\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.129935 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-scripts\") pod \"15fc6658-ad64-4026-924b-b6955cc24a5a\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.129956 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-swiftconf\") pod \"15fc6658-ad64-4026-924b-b6955cc24a5a\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.130021 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-dispersionconf\") pod \"15fc6658-ad64-4026-924b-b6955cc24a5a\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.130054 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15fc6658-ad64-4026-924b-b6955cc24a5a-etc-swift\") pod \"15fc6658-ad64-4026-924b-b6955cc24a5a\" (UID: \"15fc6658-ad64-4026-924b-b6955cc24a5a\") " Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.131695 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fc6658-ad64-4026-924b-b6955cc24a5a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "15fc6658-ad64-4026-924b-b6955cc24a5a" (UID: "15fc6658-ad64-4026-924b-b6955cc24a5a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.132656 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "15fc6658-ad64-4026-924b-b6955cc24a5a" (UID: "15fc6658-ad64-4026-924b-b6955cc24a5a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.136772 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fc6658-ad64-4026-924b-b6955cc24a5a-kube-api-access-59znd" (OuterVolumeSpecName: "kube-api-access-59znd") pod "15fc6658-ad64-4026-924b-b6955cc24a5a" (UID: "15fc6658-ad64-4026-924b-b6955cc24a5a"). InnerVolumeSpecName "kube-api-access-59znd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.153082 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-scripts" (OuterVolumeSpecName: "scripts") pod "15fc6658-ad64-4026-924b-b6955cc24a5a" (UID: "15fc6658-ad64-4026-924b-b6955cc24a5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.154172 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "15fc6658-ad64-4026-924b-b6955cc24a5a" (UID: "15fc6658-ad64-4026-924b-b6955cc24a5a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.154319 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "15fc6658-ad64-4026-924b-b6955cc24a5a" (UID: "15fc6658-ad64-4026-924b-b6955cc24a5a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.162261 4831 scope.go:117] "RemoveContainer" containerID="85bd876b2996bedfbc2b8b48bcd7c3b9ccdbdf427146592695257ad99f2a5a11" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.184758 4831 scope.go:117] "RemoveContainer" containerID="050c2d236f0a93bfd000bba6b032d817dfcdef47ef4ab46c04a808b682ab8bde" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.222721 4831 scope.go:117] "RemoveContainer" containerID="b555c2a4360a825c5752c2c1a35cea62cd1a5cbb379bdbd2b9d65525ceca189b" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.231798 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59znd\" (UniqueName: \"kubernetes.io/projected/15fc6658-ad64-4026-924b-b6955cc24a5a-kube-api-access-59znd\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.231828 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-ring-data-devices\") on node \"crc\" DevicePath 
\"\"" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.231839 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15fc6658-ad64-4026-924b-b6955cc24a5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.231847 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.231890 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15fc6658-ad64-4026-924b-b6955cc24a5a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.231899 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15fc6658-ad64-4026-924b-b6955cc24a5a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.255986 4831 scope.go:117] "RemoveContainer" containerID="f041a1e2578461b56c74618911f354010bb1f6a65b2cf1cf4b08d2b0bb974567" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.748493 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c68vc" Mar 09 16:40:04 crc kubenswrapper[4831]: I0309 16:40:04.748514 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a367ec41cc6b7260bdcca25d86b0e0bbd8b5c09f349dd8d555106528362b2193" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.037996 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.144721 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtclp\" (UniqueName: \"kubernetes.io/projected/8c0ffa7f-eeb8-489b-a60a-b04b1d731453-kube-api-access-mtclp\") pod \"8c0ffa7f-eeb8-489b-a60a-b04b1d731453\" (UID: \"8c0ffa7f-eeb8-489b-a60a-b04b1d731453\") " Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.151245 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0ffa7f-eeb8-489b-a60a-b04b1d731453-kube-api-access-mtclp" (OuterVolumeSpecName: "kube-api-access-mtclp") pod "8c0ffa7f-eeb8-489b-a60a-b04b1d731453" (UID: "8c0ffa7f-eeb8-489b-a60a-b04b1d731453"). InnerVolumeSpecName "kube-api-access-mtclp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.243233 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4"] Mar 09 16:40:05 crc kubenswrapper[4831]: E0309 16:40:05.243690 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fc6658-ad64-4026-924b-b6955cc24a5a" containerName="swift-ring-rebalance" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.243718 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fc6658-ad64-4026-924b-b6955cc24a5a" containerName="swift-ring-rebalance" Mar 09 16:40:05 crc kubenswrapper[4831]: E0309 16:40:05.243752 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0ffa7f-eeb8-489b-a60a-b04b1d731453" containerName="oc" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.243762 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0ffa7f-eeb8-489b-a60a-b04b1d731453" containerName="oc" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.243947 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="15fc6658-ad64-4026-924b-b6955cc24a5a" containerName="swift-ring-rebalance" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.243985 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0ffa7f-eeb8-489b-a60a-b04b1d731453" containerName="oc" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.244488 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.246358 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtclp\" (UniqueName: \"kubernetes.io/projected/8c0ffa7f-eeb8-489b-a60a-b04b1d731453-kube-api-access-mtclp\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.249708 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4"] Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.249741 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.250987 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.347631 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-swiftconf\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.347767 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42f3d44e-496c-455a-b624-88e153af1b0a-etc-swift\") pod \"swift-ring-rebalance-debug-zkjt4\" 
(UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.347860 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.347882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-dispersionconf\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.347947 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4dr5\" (UniqueName: \"kubernetes.io/projected/42f3d44e-496c-455a-b624-88e153af1b0a-kube-api-access-r4dr5\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.347979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-scripts\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.450016 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/42f3d44e-496c-455a-b624-88e153af1b0a-etc-swift\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.450121 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.450141 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-dispersionconf\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.450163 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4dr5\" (UniqueName: \"kubernetes.io/projected/42f3d44e-496c-455a-b624-88e153af1b0a-kube-api-access-r4dr5\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.450196 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-scripts\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.450233 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-swiftconf\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.450517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42f3d44e-496c-455a-b624-88e153af1b0a-etc-swift\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.451320 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.451549 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-scripts\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.453681 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-dispersionconf\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.453724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-swiftconf\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.475573 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4dr5\" (UniqueName: \"kubernetes.io/projected/42f3d44e-496c-455a-b624-88e153af1b0a-kube-api-access-r4dr5\") pod \"swift-ring-rebalance-debug-zkjt4\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.568597 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.626888 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fc6658-ad64-4026-924b-b6955cc24a5a" path="/var/lib/kubelet/pods/15fc6658-ad64-4026-924b-b6955cc24a5a/volumes" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.760890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" event={"ID":"8c0ffa7f-eeb8-489b-a60a-b04b1d731453","Type":"ContainerDied","Data":"33814a0e685066c7272c830e19fc5f10984fc503c2f8e099f1afbdbfe911c9f2"} Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.760943 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33814a0e685066c7272c830e19fc5f10984fc503c2f8e099f1afbdbfe911c9f2" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.761009 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551240-s2zfz" Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.807661 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551234-nm6sk"] Mar 09 16:40:05 crc kubenswrapper[4831]: I0309 16:40:05.815875 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551234-nm6sk"] Mar 09 16:40:06 crc kubenswrapper[4831]: I0309 16:40:06.070831 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4"] Mar 09 16:40:06 crc kubenswrapper[4831]: I0309 16:40:06.774659 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" event={"ID":"42f3d44e-496c-455a-b624-88e153af1b0a","Type":"ContainerStarted","Data":"acba0b1763abc14cfa4b674c9e369750233c0f3db0c441102c39b8d3d12665cd"} Mar 09 16:40:06 crc kubenswrapper[4831]: I0309 16:40:06.776510 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" event={"ID":"42f3d44e-496c-455a-b624-88e153af1b0a","Type":"ContainerStarted","Data":"bb7b2bad11ad7f056f84020af328d6a85570e16185b762f1765f9e5f2bab2006"} Mar 09 16:40:06 crc kubenswrapper[4831]: I0309 16:40:06.797746 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" podStartSLOduration=1.797726835 podStartE2EDuration="1.797726835s" podCreationTimestamp="2026-03-09 16:40:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:06.79193593 +0000 UTC m=+2533.925618353" watchObservedRunningTime="2026-03-09 16:40:06.797726835 +0000 UTC m=+2533.931409268" Mar 09 16:40:07 crc kubenswrapper[4831]: I0309 16:40:07.630992 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5c4074f3-024f-4d63-a9d7-9fe5893c42d8" path="/var/lib/kubelet/pods/5c4074f3-024f-4d63-a9d7-9fe5893c42d8/volumes" Mar 09 16:40:07 crc kubenswrapper[4831]: I0309 16:40:07.783897 4831 generic.go:334] "Generic (PLEG): container finished" podID="42f3d44e-496c-455a-b624-88e153af1b0a" containerID="acba0b1763abc14cfa4b674c9e369750233c0f3db0c441102c39b8d3d12665cd" exitCode=0 Mar 09 16:40:07 crc kubenswrapper[4831]: I0309 16:40:07.783965 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" event={"ID":"42f3d44e-496c-455a-b624-88e153af1b0a","Type":"ContainerDied","Data":"acba0b1763abc14cfa4b674c9e369750233c0f3db0c441102c39b8d3d12665cd"} Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.063754 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.115317 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4"] Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.121026 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4"] Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.204644 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42f3d44e-496c-455a-b624-88e153af1b0a-etc-swift\") pod \"42f3d44e-496c-455a-b624-88e153af1b0a\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.204681 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4dr5\" (UniqueName: \"kubernetes.io/projected/42f3d44e-496c-455a-b624-88e153af1b0a-kube-api-access-r4dr5\") pod \"42f3d44e-496c-455a-b624-88e153af1b0a\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " Mar 09 16:40:09 crc 
kubenswrapper[4831]: I0309 16:40:09.204771 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-dispersionconf\") pod \"42f3d44e-496c-455a-b624-88e153af1b0a\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.204799 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-scripts\") pod \"42f3d44e-496c-455a-b624-88e153af1b0a\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.204823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-ring-data-devices\") pod \"42f3d44e-496c-455a-b624-88e153af1b0a\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.204841 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-swiftconf\") pod \"42f3d44e-496c-455a-b624-88e153af1b0a\" (UID: \"42f3d44e-496c-455a-b624-88e153af1b0a\") " Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.205313 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "42f3d44e-496c-455a-b624-88e153af1b0a" (UID: "42f3d44e-496c-455a-b624-88e153af1b0a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.205488 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f3d44e-496c-455a-b624-88e153af1b0a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "42f3d44e-496c-455a-b624-88e153af1b0a" (UID: "42f3d44e-496c-455a-b624-88e153af1b0a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.211119 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42f3d44e-496c-455a-b624-88e153af1b0a-kube-api-access-r4dr5" (OuterVolumeSpecName: "kube-api-access-r4dr5") pod "42f3d44e-496c-455a-b624-88e153af1b0a" (UID: "42f3d44e-496c-455a-b624-88e153af1b0a"). InnerVolumeSpecName "kube-api-access-r4dr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.227172 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "42f3d44e-496c-455a-b624-88e153af1b0a" (UID: "42f3d44e-496c-455a-b624-88e153af1b0a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.227514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-scripts" (OuterVolumeSpecName: "scripts") pod "42f3d44e-496c-455a-b624-88e153af1b0a" (UID: "42f3d44e-496c-455a-b624-88e153af1b0a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.227656 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "42f3d44e-496c-455a-b624-88e153af1b0a" (UID: "42f3d44e-496c-455a-b624-88e153af1b0a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.306338 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42f3d44e-496c-455a-b624-88e153af1b0a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.306370 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4dr5\" (UniqueName: \"kubernetes.io/projected/42f3d44e-496c-455a-b624-88e153af1b0a-kube-api-access-r4dr5\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.306383 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.306411 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.306426 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42f3d44e-496c-455a-b624-88e153af1b0a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.306445 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/42f3d44e-496c-455a-b624-88e153af1b0a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.626877 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42f3d44e-496c-455a-b624-88e153af1b0a" path="/var/lib/kubelet/pods/42f3d44e-496c-455a-b624-88e153af1b0a/volumes" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.803097 4831 scope.go:117] "RemoveContainer" containerID="acba0b1763abc14cfa4b674c9e369750233c0f3db0c441102c39b8d3d12665cd" Mar 09 16:40:09 crc kubenswrapper[4831]: I0309 16:40:09.803149 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkjt4" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.259552 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm"] Mar 09 16:40:10 crc kubenswrapper[4831]: E0309 16:40:10.259867 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f3d44e-496c-455a-b624-88e153af1b0a" containerName="swift-ring-rebalance" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.259885 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f3d44e-496c-455a-b624-88e153af1b0a" containerName="swift-ring-rebalance" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.260046 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="42f3d44e-496c-455a-b624-88e153af1b0a" containerName="swift-ring-rebalance" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.260636 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.263558 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.264060 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.268719 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm"] Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.425689 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-dispersionconf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.425764 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f339fde6-64a1-49ef-8769-f6c03bc1e04a-etc-swift\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.425805 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-swiftconf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.425831 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-scripts\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.425852 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqdcf\" (UniqueName: \"kubernetes.io/projected/f339fde6-64a1-49ef-8769-f6c03bc1e04a-kube-api-access-mqdcf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.425894 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.526866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-dispersionconf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.526948 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f339fde6-64a1-49ef-8769-f6c03bc1e04a-etc-swift\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.527011 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-swiftconf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.527059 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-scripts\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.527090 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqdcf\" (UniqueName: \"kubernetes.io/projected/f339fde6-64a1-49ef-8769-f6c03bc1e04a-kube-api-access-mqdcf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.527148 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.527753 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f339fde6-64a1-49ef-8769-f6c03bc1e04a-etc-swift\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.528204 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-scripts\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.528379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.531224 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-swiftconf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.531626 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-dispersionconf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.544983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqdcf\" (UniqueName: \"kubernetes.io/projected/f339fde6-64a1-49ef-8769-f6c03bc1e04a-kube-api-access-mqdcf\") pod \"swift-ring-rebalance-debug-qtmxm\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:10 crc kubenswrapper[4831]: I0309 16:40:10.593752 4831 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:11 crc kubenswrapper[4831]: W0309 16:40:11.051230 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf339fde6_64a1_49ef_8769_f6c03bc1e04a.slice/crio-2b303d1c986c237832b88445577d73e76318cbecf49e3434acae05cf69b010e8 WatchSource:0}: Error finding container 2b303d1c986c237832b88445577d73e76318cbecf49e3434acae05cf69b010e8: Status 404 returned error can't find the container with id 2b303d1c986c237832b88445577d73e76318cbecf49e3434acae05cf69b010e8 Mar 09 16:40:11 crc kubenswrapper[4831]: I0309 16:40:11.051532 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm"] Mar 09 16:40:11 crc kubenswrapper[4831]: I0309 16:40:11.825832 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" event={"ID":"f339fde6-64a1-49ef-8769-f6c03bc1e04a","Type":"ContainerStarted","Data":"bcae82598dbbc530a18515b2e2c5961e10856de607ba63e57c569400c3077f91"} Mar 09 16:40:11 crc kubenswrapper[4831]: I0309 16:40:11.826161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" event={"ID":"f339fde6-64a1-49ef-8769-f6c03bc1e04a","Type":"ContainerStarted","Data":"2b303d1c986c237832b88445577d73e76318cbecf49e3434acae05cf69b010e8"} Mar 09 16:40:11 crc kubenswrapper[4831]: I0309 16:40:11.844873 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" podStartSLOduration=1.844850186 podStartE2EDuration="1.844850186s" podCreationTimestamp="2026-03-09 16:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:11.84426531 +0000 UTC m=+2538.977947753" 
watchObservedRunningTime="2026-03-09 16:40:11.844850186 +0000 UTC m=+2538.978532629" Mar 09 16:40:12 crc kubenswrapper[4831]: I0309 16:40:12.617274 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:40:12 crc kubenswrapper[4831]: E0309 16:40:12.617505 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:40:12 crc kubenswrapper[4831]: I0309 16:40:12.835163 4831 generic.go:334] "Generic (PLEG): container finished" podID="f339fde6-64a1-49ef-8769-f6c03bc1e04a" containerID="bcae82598dbbc530a18515b2e2c5961e10856de607ba63e57c569400c3077f91" exitCode=0 Mar 09 16:40:12 crc kubenswrapper[4831]: I0309 16:40:12.835215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" event={"ID":"f339fde6-64a1-49ef-8769-f6c03bc1e04a","Type":"ContainerDied","Data":"bcae82598dbbc530a18515b2e2c5961e10856de607ba63e57c569400c3077f91"} Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.128935 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.184943 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm"] Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.192599 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm"] Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.279385 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-scripts\") pod \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.279527 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-ring-data-devices\") pod \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.279607 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqdcf\" (UniqueName: \"kubernetes.io/projected/f339fde6-64a1-49ef-8769-f6c03bc1e04a-kube-api-access-mqdcf\") pod \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.279642 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-dispersionconf\") pod \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.279678 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f339fde6-64a1-49ef-8769-f6c03bc1e04a-etc-swift\") pod \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.279696 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-swiftconf\") pod \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\" (UID: \"f339fde6-64a1-49ef-8769-f6c03bc1e04a\") " Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.281428 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f339fde6-64a1-49ef-8769-f6c03bc1e04a" (UID: "f339fde6-64a1-49ef-8769-f6c03bc1e04a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.281777 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f339fde6-64a1-49ef-8769-f6c03bc1e04a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f339fde6-64a1-49ef-8769-f6c03bc1e04a" (UID: "f339fde6-64a1-49ef-8769-f6c03bc1e04a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.291568 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f339fde6-64a1-49ef-8769-f6c03bc1e04a-kube-api-access-mqdcf" (OuterVolumeSpecName: "kube-api-access-mqdcf") pod "f339fde6-64a1-49ef-8769-f6c03bc1e04a" (UID: "f339fde6-64a1-49ef-8769-f6c03bc1e04a"). InnerVolumeSpecName "kube-api-access-mqdcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.302305 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-scripts" (OuterVolumeSpecName: "scripts") pod "f339fde6-64a1-49ef-8769-f6c03bc1e04a" (UID: "f339fde6-64a1-49ef-8769-f6c03bc1e04a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.306447 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f339fde6-64a1-49ef-8769-f6c03bc1e04a" (UID: "f339fde6-64a1-49ef-8769-f6c03bc1e04a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.311042 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f339fde6-64a1-49ef-8769-f6c03bc1e04a" (UID: "f339fde6-64a1-49ef-8769-f6c03bc1e04a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.381490 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.381530 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f339fde6-64a1-49ef-8769-f6c03bc1e04a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.381542 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqdcf\" (UniqueName: \"kubernetes.io/projected/f339fde6-64a1-49ef-8769-f6c03bc1e04a-kube-api-access-mqdcf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.381550 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.381560 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f339fde6-64a1-49ef-8769-f6c03bc1e04a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.381570 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f339fde6-64a1-49ef-8769-f6c03bc1e04a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.858785 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b303d1c986c237832b88445577d73e76318cbecf49e3434acae05cf69b010e8" Mar 09 16:40:14 crc kubenswrapper[4831]: I0309 16:40:14.858912 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtmxm" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.341683 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k"] Mar 09 16:40:15 crc kubenswrapper[4831]: E0309 16:40:15.342049 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f339fde6-64a1-49ef-8769-f6c03bc1e04a" containerName="swift-ring-rebalance" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.342066 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f339fde6-64a1-49ef-8769-f6c03bc1e04a" containerName="swift-ring-rebalance" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.342280 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f339fde6-64a1-49ef-8769-f6c03bc1e04a" containerName="swift-ring-rebalance" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.342942 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.349113 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.349271 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.366616 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k"] Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.496355 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7637a02-6685-4517-b638-9de30128c54f-etc-swift\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.496419 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.496465 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-dispersionconf\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.496494 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-swiftconf\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.496520 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5twx\" (UniqueName: \"kubernetes.io/projected/a7637a02-6685-4517-b638-9de30128c54f-kube-api-access-w5twx\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.496807 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-scripts\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.598461 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7637a02-6685-4517-b638-9de30128c54f-etc-swift\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.598530 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.598594 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-dispersionconf\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.598641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-swiftconf\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.598676 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5twx\" 
(UniqueName: \"kubernetes.io/projected/a7637a02-6685-4517-b638-9de30128c54f-kube-api-access-w5twx\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.598872 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-scripts\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.599331 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7637a02-6685-4517-b638-9de30128c54f-etc-swift\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.599370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.599504 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-scripts\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.603373 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-swiftconf\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.603786 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-dispersionconf\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.627773 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f339fde6-64a1-49ef-8769-f6c03bc1e04a" path="/var/lib/kubelet/pods/f339fde6-64a1-49ef-8769-f6c03bc1e04a/volumes" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.633330 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5twx\" (UniqueName: \"kubernetes.io/projected/a7637a02-6685-4517-b638-9de30128c54f-kube-api-access-w5twx\") pod \"swift-ring-rebalance-debug-7gt9k\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:15 crc kubenswrapper[4831]: I0309 16:40:15.683253 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:16 crc kubenswrapper[4831]: I0309 16:40:16.083899 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k"] Mar 09 16:40:16 crc kubenswrapper[4831]: W0309 16:40:16.088099 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7637a02_6685_4517_b638_9de30128c54f.slice/crio-175f608e06fe06c42b110205f4cb8122eee3687f5ea1acfafb023d6eb9f5d32c WatchSource:0}: Error finding container 175f608e06fe06c42b110205f4cb8122eee3687f5ea1acfafb023d6eb9f5d32c: Status 404 returned error can't find the container with id 175f608e06fe06c42b110205f4cb8122eee3687f5ea1acfafb023d6eb9f5d32c Mar 09 16:40:16 crc kubenswrapper[4831]: I0309 16:40:16.878719 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" event={"ID":"a7637a02-6685-4517-b638-9de30128c54f","Type":"ContainerStarted","Data":"17e90d958e1e931dec95b8696e1eb7b276dc7804bff98635353878da18178589"} Mar 09 16:40:16 crc kubenswrapper[4831]: I0309 16:40:16.879059 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" event={"ID":"a7637a02-6685-4517-b638-9de30128c54f","Type":"ContainerStarted","Data":"175f608e06fe06c42b110205f4cb8122eee3687f5ea1acfafb023d6eb9f5d32c"} Mar 09 16:40:16 crc kubenswrapper[4831]: I0309 16:40:16.900837 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" podStartSLOduration=1.9008129299999998 podStartE2EDuration="1.90081293s" podCreationTimestamp="2026-03-09 16:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:16.895094277 +0000 UTC m=+2544.028776700" watchObservedRunningTime="2026-03-09 
16:40:16.90081293 +0000 UTC m=+2544.034495353" Mar 09 16:40:17 crc kubenswrapper[4831]: I0309 16:40:17.887461 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7637a02-6685-4517-b638-9de30128c54f" containerID="17e90d958e1e931dec95b8696e1eb7b276dc7804bff98635353878da18178589" exitCode=0 Mar 09 16:40:17 crc kubenswrapper[4831]: I0309 16:40:17.887520 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" event={"ID":"a7637a02-6685-4517-b638-9de30128c54f","Type":"ContainerDied","Data":"17e90d958e1e931dec95b8696e1eb7b276dc7804bff98635353878da18178589"} Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.158382 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.188905 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k"] Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.196291 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k"] Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.352953 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-dispersionconf\") pod \"a7637a02-6685-4517-b638-9de30128c54f\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.353039 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-swiftconf\") pod \"a7637a02-6685-4517-b638-9de30128c54f\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.353085 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7637a02-6685-4517-b638-9de30128c54f-etc-swift\") pod \"a7637a02-6685-4517-b638-9de30128c54f\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.353112 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-ring-data-devices\") pod \"a7637a02-6685-4517-b638-9de30128c54f\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.353178 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5twx\" (UniqueName: \"kubernetes.io/projected/a7637a02-6685-4517-b638-9de30128c54f-kube-api-access-w5twx\") pod \"a7637a02-6685-4517-b638-9de30128c54f\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.353194 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-scripts\") pod \"a7637a02-6685-4517-b638-9de30128c54f\" (UID: \"a7637a02-6685-4517-b638-9de30128c54f\") " Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.354165 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a7637a02-6685-4517-b638-9de30128c54f" (UID: "a7637a02-6685-4517-b638-9de30128c54f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.354540 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7637a02-6685-4517-b638-9de30128c54f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a7637a02-6685-4517-b638-9de30128c54f" (UID: "a7637a02-6685-4517-b638-9de30128c54f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.360582 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7637a02-6685-4517-b638-9de30128c54f-kube-api-access-w5twx" (OuterVolumeSpecName: "kube-api-access-w5twx") pod "a7637a02-6685-4517-b638-9de30128c54f" (UID: "a7637a02-6685-4517-b638-9de30128c54f"). InnerVolumeSpecName "kube-api-access-w5twx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.374283 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-scripts" (OuterVolumeSpecName: "scripts") pod "a7637a02-6685-4517-b638-9de30128c54f" (UID: "a7637a02-6685-4517-b638-9de30128c54f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.378492 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a7637a02-6685-4517-b638-9de30128c54f" (UID: "a7637a02-6685-4517-b638-9de30128c54f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.386269 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a7637a02-6685-4517-b638-9de30128c54f" (UID: "a7637a02-6685-4517-b638-9de30128c54f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.460168 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.460215 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7637a02-6685-4517-b638-9de30128c54f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.460226 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7637a02-6685-4517-b638-9de30128c54f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.460234 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.460243 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5twx\" (UniqueName: \"kubernetes.io/projected/a7637a02-6685-4517-b638-9de30128c54f-kube-api-access-w5twx\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.460255 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a7637a02-6685-4517-b638-9de30128c54f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.627311 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7637a02-6685-4517-b638-9de30128c54f" path="/var/lib/kubelet/pods/a7637a02-6685-4517-b638-9de30128c54f/volumes" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.907467 4831 scope.go:117] "RemoveContainer" containerID="17e90d958e1e931dec95b8696e1eb7b276dc7804bff98635353878da18178589" Mar 09 16:40:19 crc kubenswrapper[4831]: I0309 16:40:19.907522 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7gt9k" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.324430 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv"] Mar 09 16:40:20 crc kubenswrapper[4831]: E0309 16:40:20.324743 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7637a02-6685-4517-b638-9de30128c54f" containerName="swift-ring-rebalance" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.324760 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7637a02-6685-4517-b638-9de30128c54f" containerName="swift-ring-rebalance" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.324952 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7637a02-6685-4517-b638-9de30128c54f" containerName="swift-ring-rebalance" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.325442 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.327384 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.327441 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.338502 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv"] Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.373190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-dispersionconf\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.373308 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2nsv\" (UniqueName: \"kubernetes.io/projected/b7f51670-c481-4edc-adde-66030ddd2773-kube-api-access-b2nsv\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.373374 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7f51670-c481-4edc-adde-66030ddd2773-etc-swift\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.373442 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-ring-data-devices\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.373525 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-swiftconf\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.373553 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-scripts\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.474525 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-dispersionconf\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.474585 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2nsv\" (UniqueName: \"kubernetes.io/projected/b7f51670-c481-4edc-adde-66030ddd2773-kube-api-access-b2nsv\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc 
kubenswrapper[4831]: I0309 16:40:20.474626 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7f51670-c481-4edc-adde-66030ddd2773-etc-swift\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.474661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-ring-data-devices\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.474690 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-swiftconf\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.474710 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-scripts\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.475259 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7f51670-c481-4edc-adde-66030ddd2773-etc-swift\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: 
I0309 16:40:20.475662 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-scripts\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.475701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-ring-data-devices\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.478879 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-swiftconf\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.489379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-dispersionconf\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.489786 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2nsv\" (UniqueName: \"kubernetes.io/projected/b7f51670-c481-4edc-adde-66030ddd2773-kube-api-access-b2nsv\") pod \"swift-ring-rebalance-debug-xr2rv\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:20 crc kubenswrapper[4831]: I0309 16:40:20.640536 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:21 crc kubenswrapper[4831]: I0309 16:40:21.084010 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv"] Mar 09 16:40:21 crc kubenswrapper[4831]: I0309 16:40:21.926227 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" event={"ID":"b7f51670-c481-4edc-adde-66030ddd2773","Type":"ContainerStarted","Data":"5ae8170330c8be2b1ef0c3df06a573932bf6a2598cb7c3978c28997d29e0ed07"} Mar 09 16:40:21 crc kubenswrapper[4831]: I0309 16:40:21.926270 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" event={"ID":"b7f51670-c481-4edc-adde-66030ddd2773","Type":"ContainerStarted","Data":"7ee2a2cab3ca69d8440842ce198e58db7a4b6b0c1222e2bf6e3b7e75f269e1ee"} Mar 09 16:40:21 crc kubenswrapper[4831]: I0309 16:40:21.958290 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" podStartSLOduration=1.958272266 podStartE2EDuration="1.958272266s" podCreationTimestamp="2026-03-09 16:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:21.955047084 +0000 UTC m=+2549.088729507" watchObservedRunningTime="2026-03-09 16:40:21.958272266 +0000 UTC m=+2549.091954689" Mar 09 16:40:22 crc kubenswrapper[4831]: I0309 16:40:22.936698 4831 generic.go:334] "Generic (PLEG): container finished" podID="b7f51670-c481-4edc-adde-66030ddd2773" containerID="5ae8170330c8be2b1ef0c3df06a573932bf6a2598cb7c3978c28997d29e0ed07" exitCode=0 Mar 09 16:40:22 crc kubenswrapper[4831]: I0309 16:40:22.936750 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" 
event={"ID":"b7f51670-c481-4edc-adde-66030ddd2773","Type":"ContainerDied","Data":"5ae8170330c8be2b1ef0c3df06a573932bf6a2598cb7c3978c28997d29e0ed07"} Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.216551 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.246914 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv"] Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.253757 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv"] Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.328845 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-scripts\") pod \"b7f51670-c481-4edc-adde-66030ddd2773\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.328890 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-ring-data-devices\") pod \"b7f51670-c481-4edc-adde-66030ddd2773\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.328942 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-swiftconf\") pod \"b7f51670-c481-4edc-adde-66030ddd2773\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.329000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-dispersionconf\") 
pod \"b7f51670-c481-4edc-adde-66030ddd2773\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.329041 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2nsv\" (UniqueName: \"kubernetes.io/projected/b7f51670-c481-4edc-adde-66030ddd2773-kube-api-access-b2nsv\") pod \"b7f51670-c481-4edc-adde-66030ddd2773\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.329122 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7f51670-c481-4edc-adde-66030ddd2773-etc-swift\") pod \"b7f51670-c481-4edc-adde-66030ddd2773\" (UID: \"b7f51670-c481-4edc-adde-66030ddd2773\") " Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.330511 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f51670-c481-4edc-adde-66030ddd2773-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7f51670-c481-4edc-adde-66030ddd2773" (UID: "b7f51670-c481-4edc-adde-66030ddd2773"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.330610 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b7f51670-c481-4edc-adde-66030ddd2773" (UID: "b7f51670-c481-4edc-adde-66030ddd2773"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.344593 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f51670-c481-4edc-adde-66030ddd2773-kube-api-access-b2nsv" (OuterVolumeSpecName: "kube-api-access-b2nsv") pod "b7f51670-c481-4edc-adde-66030ddd2773" (UID: "b7f51670-c481-4edc-adde-66030ddd2773"). InnerVolumeSpecName "kube-api-access-b2nsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.351923 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b7f51670-c481-4edc-adde-66030ddd2773" (UID: "b7f51670-c481-4edc-adde-66030ddd2773"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.352391 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-scripts" (OuterVolumeSpecName: "scripts") pod "b7f51670-c481-4edc-adde-66030ddd2773" (UID: "b7f51670-c481-4edc-adde-66030ddd2773"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.361417 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b7f51670-c481-4edc-adde-66030ddd2773" (UID: "b7f51670-c481-4edc-adde-66030ddd2773"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.431846 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7f51670-c481-4edc-adde-66030ddd2773-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.432148 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.432168 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7f51670-c481-4edc-adde-66030ddd2773-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.432182 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.432194 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7f51670-c481-4edc-adde-66030ddd2773-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.432205 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2nsv\" (UniqueName: \"kubernetes.io/projected/b7f51670-c481-4edc-adde-66030ddd2773-kube-api-access-b2nsv\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.617232 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:40:24 crc kubenswrapper[4831]: E0309 16:40:24.617732 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.958952 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ee2a2cab3ca69d8440842ce198e58db7a4b6b0c1222e2bf6e3b7e75f269e1ee" Mar 09 16:40:24 crc kubenswrapper[4831]: I0309 16:40:24.959041 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xr2rv" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.375853 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x"] Mar 09 16:40:25 crc kubenswrapper[4831]: E0309 16:40:25.376192 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f51670-c481-4edc-adde-66030ddd2773" containerName="swift-ring-rebalance" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.376208 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f51670-c481-4edc-adde-66030ddd2773" containerName="swift-ring-rebalance" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.376433 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7f51670-c481-4edc-adde-66030ddd2773" containerName="swift-ring-rebalance" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.376981 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.379109 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.386602 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.410638 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x"] Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.549089 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-ring-data-devices\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.549155 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-dispersionconf\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.549181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-swiftconf\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.549284 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrfb4\" (UniqueName: \"kubernetes.io/projected/213598e0-b50c-4c53-af3b-f1d959fcc976-kube-api-access-hrfb4\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.549497 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/213598e0-b50c-4c53-af3b-f1d959fcc976-etc-swift\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.549619 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-scripts\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.630010 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f51670-c481-4edc-adde-66030ddd2773" path="/var/lib/kubelet/pods/b7f51670-c481-4edc-adde-66030ddd2773/volumes" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.651833 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/213598e0-b50c-4c53-af3b-f1d959fcc976-etc-swift\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.652105 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-scripts\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.652293 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-ring-data-devices\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.652445 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-dispersionconf\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.652507 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-swiftconf\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.652598 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrfb4\" (UniqueName: \"kubernetes.io/projected/213598e0-b50c-4c53-af3b-f1d959fcc976-kube-api-access-hrfb4\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.654290 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/213598e0-b50c-4c53-af3b-f1d959fcc976-etc-swift\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.655551 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-scripts\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.656585 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-ring-data-devices\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.661394 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-dispersionconf\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.673580 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-swiftconf\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.690215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrfb4\" (UniqueName: 
\"kubernetes.io/projected/213598e0-b50c-4c53-af3b-f1d959fcc976-kube-api-access-hrfb4\") pod \"swift-ring-rebalance-debug-qsg8x\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:25 crc kubenswrapper[4831]: I0309 16:40:25.699206 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:26 crc kubenswrapper[4831]: I0309 16:40:26.102423 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x"] Mar 09 16:40:26 crc kubenswrapper[4831]: W0309 16:40:26.107387 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod213598e0_b50c_4c53_af3b_f1d959fcc976.slice/crio-b11040a55ee535cab3f58a2479d6ff8d48a09c77dc8ba65347afa6dddd856223 WatchSource:0}: Error finding container b11040a55ee535cab3f58a2479d6ff8d48a09c77dc8ba65347afa6dddd856223: Status 404 returned error can't find the container with id b11040a55ee535cab3f58a2479d6ff8d48a09c77dc8ba65347afa6dddd856223 Mar 09 16:40:26 crc kubenswrapper[4831]: I0309 16:40:26.976457 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" event={"ID":"213598e0-b50c-4c53-af3b-f1d959fcc976","Type":"ContainerStarted","Data":"edd756a4c3adfc4f43bdbb612eb8c8278adf6b3a565982a5f593e7e2433b5e4c"} Mar 09 16:40:26 crc kubenswrapper[4831]: I0309 16:40:26.976841 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" event={"ID":"213598e0-b50c-4c53-af3b-f1d959fcc976","Type":"ContainerStarted","Data":"b11040a55ee535cab3f58a2479d6ff8d48a09c77dc8ba65347afa6dddd856223"} Mar 09 16:40:26 crc kubenswrapper[4831]: I0309 16:40:26.992416 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" 
podStartSLOduration=1.992384119 podStartE2EDuration="1.992384119s" podCreationTimestamp="2026-03-09 16:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:26.990455304 +0000 UTC m=+2554.124137747" watchObservedRunningTime="2026-03-09 16:40:26.992384119 +0000 UTC m=+2554.126066542" Mar 09 16:40:27 crc kubenswrapper[4831]: I0309 16:40:27.988792 4831 generic.go:334] "Generic (PLEG): container finished" podID="213598e0-b50c-4c53-af3b-f1d959fcc976" containerID="edd756a4c3adfc4f43bdbb612eb8c8278adf6b3a565982a5f593e7e2433b5e4c" exitCode=0 Mar 09 16:40:27 crc kubenswrapper[4831]: I0309 16:40:27.988835 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" event={"ID":"213598e0-b50c-4c53-af3b-f1d959fcc976","Type":"ContainerDied","Data":"edd756a4c3adfc4f43bdbb612eb8c8278adf6b3a565982a5f593e7e2433b5e4c"} Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.374687 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.405379 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x"] Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.418702 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x"] Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.517905 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-dispersionconf\") pod \"213598e0-b50c-4c53-af3b-f1d959fcc976\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.517955 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-swiftconf\") pod \"213598e0-b50c-4c53-af3b-f1d959fcc976\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.517996 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/213598e0-b50c-4c53-af3b-f1d959fcc976-etc-swift\") pod \"213598e0-b50c-4c53-af3b-f1d959fcc976\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.518040 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-ring-data-devices\") pod \"213598e0-b50c-4c53-af3b-f1d959fcc976\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.518066 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-hrfb4\" (UniqueName: \"kubernetes.io/projected/213598e0-b50c-4c53-af3b-f1d959fcc976-kube-api-access-hrfb4\") pod \"213598e0-b50c-4c53-af3b-f1d959fcc976\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.518115 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-scripts\") pod \"213598e0-b50c-4c53-af3b-f1d959fcc976\" (UID: \"213598e0-b50c-4c53-af3b-f1d959fcc976\") " Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.519291 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/213598e0-b50c-4c53-af3b-f1d959fcc976-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "213598e0-b50c-4c53-af3b-f1d959fcc976" (UID: "213598e0-b50c-4c53-af3b-f1d959fcc976"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.520026 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "213598e0-b50c-4c53-af3b-f1d959fcc976" (UID: "213598e0-b50c-4c53-af3b-f1d959fcc976"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.537114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/213598e0-b50c-4c53-af3b-f1d959fcc976-kube-api-access-hrfb4" (OuterVolumeSpecName: "kube-api-access-hrfb4") pod "213598e0-b50c-4c53-af3b-f1d959fcc976" (UID: "213598e0-b50c-4c53-af3b-f1d959fcc976"). InnerVolumeSpecName "kube-api-access-hrfb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.540007 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-scripts" (OuterVolumeSpecName: "scripts") pod "213598e0-b50c-4c53-af3b-f1d959fcc976" (UID: "213598e0-b50c-4c53-af3b-f1d959fcc976"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.544118 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "213598e0-b50c-4c53-af3b-f1d959fcc976" (UID: "213598e0-b50c-4c53-af3b-f1d959fcc976"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.560710 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "213598e0-b50c-4c53-af3b-f1d959fcc976" (UID: "213598e0-b50c-4c53-af3b-f1d959fcc976"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.634570 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="213598e0-b50c-4c53-af3b-f1d959fcc976" path="/var/lib/kubelet/pods/213598e0-b50c-4c53-af3b-f1d959fcc976/volumes" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.636613 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.636652 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/213598e0-b50c-4c53-af3b-f1d959fcc976-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.636667 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/213598e0-b50c-4c53-af3b-f1d959fcc976-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.636695 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.636717 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrfb4\" (UniqueName: \"kubernetes.io/projected/213598e0-b50c-4c53-af3b-f1d959fcc976-kube-api-access-hrfb4\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:29 crc kubenswrapper[4831]: I0309 16:40:29.636734 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/213598e0-b50c-4c53-af3b-f1d959fcc976-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.010544 4831 scope.go:117] "RemoveContainer" 
containerID="edd756a4c3adfc4f43bdbb612eb8c8278adf6b3a565982a5f593e7e2433b5e4c" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.010884 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qsg8x" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.550316 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7kf27"] Mar 09 16:40:30 crc kubenswrapper[4831]: E0309 16:40:30.550646 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="213598e0-b50c-4c53-af3b-f1d959fcc976" containerName="swift-ring-rebalance" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.550662 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="213598e0-b50c-4c53-af3b-f1d959fcc976" containerName="swift-ring-rebalance" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.550880 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="213598e0-b50c-4c53-af3b-f1d959fcc976" containerName="swift-ring-rebalance" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.551363 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.554045 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.554045 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.568013 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7kf27"] Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.652068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-dispersionconf\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.652154 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94z9m\" (UniqueName: \"kubernetes.io/projected/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-kube-api-access-94z9m\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.652227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-scripts\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.652311 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-swiftconf\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.652360 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-ring-data-devices\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.652419 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-etc-swift\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.754046 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-dispersionconf\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.754147 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94z9m\" (UniqueName: \"kubernetes.io/projected/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-kube-api-access-94z9m\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 
crc kubenswrapper[4831]: I0309 16:40:30.754186 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-scripts\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.755208 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-scripts\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.755579 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-swiftconf\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.755606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-ring-data-devices\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.755643 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-etc-swift\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: 
I0309 16:40:30.756273 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-etc-swift\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.757167 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-ring-data-devices\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.760260 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-dispersionconf\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.761007 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-swiftconf\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 16:40:30.781061 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94z9m\" (UniqueName: \"kubernetes.io/projected/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-kube-api-access-94z9m\") pod \"swift-ring-rebalance-debug-7kf27\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:30 crc kubenswrapper[4831]: I0309 
16:40:30.870185 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:31 crc kubenswrapper[4831]: I0309 16:40:31.063664 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7kf27"] Mar 09 16:40:31 crc kubenswrapper[4831]: W0309 16:40:31.071148 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd6fbc74_9ad6_4a0b_883c_6bce78e27a0b.slice/crio-b74f416cd536aaa5ec8102ab7bfced4d75ad506f7214bead6554c7f7a064bca4 WatchSource:0}: Error finding container b74f416cd536aaa5ec8102ab7bfced4d75ad506f7214bead6554c7f7a064bca4: Status 404 returned error can't find the container with id b74f416cd536aaa5ec8102ab7bfced4d75ad506f7214bead6554c7f7a064bca4 Mar 09 16:40:32 crc kubenswrapper[4831]: I0309 16:40:32.038475 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" event={"ID":"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b","Type":"ContainerStarted","Data":"4549385f2b07202779fb3d5fcb640fdcf26deddec63882783561bbdecafbc192"} Mar 09 16:40:32 crc kubenswrapper[4831]: I0309 16:40:32.038718 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" event={"ID":"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b","Type":"ContainerStarted","Data":"b74f416cd536aaa5ec8102ab7bfced4d75ad506f7214bead6554c7f7a064bca4"} Mar 09 16:40:32 crc kubenswrapper[4831]: I0309 16:40:32.061042 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" podStartSLOduration=2.0610213059999998 podStartE2EDuration="2.061021306s" podCreationTimestamp="2026-03-09 16:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:32.054898841 
+0000 UTC m=+2559.188581264" watchObservedRunningTime="2026-03-09 16:40:32.061021306 +0000 UTC m=+2559.194703729" Mar 09 16:40:33 crc kubenswrapper[4831]: I0309 16:40:33.050081 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" containerID="4549385f2b07202779fb3d5fcb640fdcf26deddec63882783561bbdecafbc192" exitCode=0 Mar 09 16:40:33 crc kubenswrapper[4831]: I0309 16:40:33.050466 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" event={"ID":"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b","Type":"ContainerDied","Data":"4549385f2b07202779fb3d5fcb640fdcf26deddec63882783561bbdecafbc192"} Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.324571 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.362160 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7kf27"] Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.369151 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7kf27"] Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.503000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-scripts\") pod \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.503098 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-dispersionconf\") pod \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 
16:40:34.503123 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-ring-data-devices\") pod \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.503681 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" (UID: "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.503955 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94z9m\" (UniqueName: \"kubernetes.io/projected/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-kube-api-access-94z9m\") pod \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.504011 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-etc-swift\") pod \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.504040 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-swiftconf\") pod \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\" (UID: \"fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b\") " Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.504472 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.504801 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" (UID: "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.513103 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-kube-api-access-94z9m" (OuterVolumeSpecName: "kube-api-access-94z9m") pod "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" (UID: "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b"). InnerVolumeSpecName "kube-api-access-94z9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.530386 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" (UID: "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.538987 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-scripts" (OuterVolumeSpecName: "scripts") pod "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" (UID: "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.542210 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" (UID: "fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.606304 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94z9m\" (UniqueName: \"kubernetes.io/projected/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-kube-api-access-94z9m\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.606345 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.606358 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.606371 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:34 crc kubenswrapper[4831]: I0309 16:40:34.606383 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.070868 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b74f416cd536aaa5ec8102ab7bfced4d75ad506f7214bead6554c7f7a064bca4" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.070982 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7kf27" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.488279 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7"] Mar 09 16:40:35 crc kubenswrapper[4831]: E0309 16:40:35.488611 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" containerName="swift-ring-rebalance" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.488629 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" containerName="swift-ring-rebalance" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.488799 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" containerName="swift-ring-rebalance" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.489288 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.493489 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.493743 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.499626 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7"] Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.620003 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-scripts\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.620542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.620689 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwbwn\" (UniqueName: \"kubernetes.io/projected/d67dea01-d6b4-461f-86c4-27afb15e8b1f-kube-api-access-kwbwn\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.620797 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-dispersionconf\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.620882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d67dea01-d6b4-461f-86c4-27afb15e8b1f-etc-swift\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.620981 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-swiftconf\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.626947 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b" path="/var/lib/kubelet/pods/fd6fbc74-9ad6-4a0b-883c-6bce78e27a0b/volumes" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.722553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-scripts\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.722631 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.722668 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwbwn\" (UniqueName: \"kubernetes.io/projected/d67dea01-d6b4-461f-86c4-27afb15e8b1f-kube-api-access-kwbwn\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.722708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d67dea01-d6b4-461f-86c4-27afb15e8b1f-etc-swift\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.722752 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-dispersionconf\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.722774 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-swiftconf\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.723794 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/d67dea01-d6b4-461f-86c4-27afb15e8b1f-etc-swift\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.723989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-scripts\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.724149 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.726338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-swiftconf\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.731092 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-dispersionconf\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.739008 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwbwn\" (UniqueName: 
\"kubernetes.io/projected/d67dea01-d6b4-461f-86c4-27afb15e8b1f-kube-api-access-kwbwn\") pod \"swift-ring-rebalance-debug-nk5b7\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:35 crc kubenswrapper[4831]: I0309 16:40:35.823689 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:36 crc kubenswrapper[4831]: I0309 16:40:36.268932 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7"] Mar 09 16:40:37 crc kubenswrapper[4831]: I0309 16:40:37.088302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" event={"ID":"d67dea01-d6b4-461f-86c4-27afb15e8b1f","Type":"ContainerStarted","Data":"133e72f5aeafc8981487ae59077b73fe5f75792e493243db5477fe2c920573a4"} Mar 09 16:40:37 crc kubenswrapper[4831]: I0309 16:40:37.088687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" event={"ID":"d67dea01-d6b4-461f-86c4-27afb15e8b1f","Type":"ContainerStarted","Data":"0753ed6c1141ac52115898efbe097d5ea3f481094f9d4447eed2de6b4f93a7bb"} Mar 09 16:40:37 crc kubenswrapper[4831]: I0309 16:40:37.108248 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" podStartSLOduration=2.108229939 podStartE2EDuration="2.108229939s" podCreationTimestamp="2026-03-09 16:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:37.102236979 +0000 UTC m=+2564.235919412" watchObservedRunningTime="2026-03-09 16:40:37.108229939 +0000 UTC m=+2564.241912362" Mar 09 16:40:38 crc kubenswrapper[4831]: I0309 16:40:38.097324 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="d67dea01-d6b4-461f-86c4-27afb15e8b1f" containerID="133e72f5aeafc8981487ae59077b73fe5f75792e493243db5477fe2c920573a4" exitCode=0 Mar 09 16:40:38 crc kubenswrapper[4831]: I0309 16:40:38.097364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" event={"ID":"d67dea01-d6b4-461f-86c4-27afb15e8b1f","Type":"ContainerDied","Data":"133e72f5aeafc8981487ae59077b73fe5f75792e493243db5477fe2c920573a4"} Mar 09 16:40:38 crc kubenswrapper[4831]: I0309 16:40:38.617715 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:40:38 crc kubenswrapper[4831]: E0309 16:40:38.618251 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.371612 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.408258 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7"] Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.414604 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7"] Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.485625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-swiftconf\") pod \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.485693 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-ring-data-devices\") pod \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.485825 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-scripts\") pod \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.486204 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d67dea01-d6b4-461f-86c4-27afb15e8b1f" (UID: "d67dea01-d6b4-461f-86c4-27afb15e8b1f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.486339 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwbwn\" (UniqueName: \"kubernetes.io/projected/d67dea01-d6b4-461f-86c4-27afb15e8b1f-kube-api-access-kwbwn\") pod \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.486410 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-dispersionconf\") pod \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.486442 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d67dea01-d6b4-461f-86c4-27afb15e8b1f-etc-swift\") pod \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\" (UID: \"d67dea01-d6b4-461f-86c4-27afb15e8b1f\") " Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.486728 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.487302 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d67dea01-d6b4-461f-86c4-27afb15e8b1f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d67dea01-d6b4-461f-86c4-27afb15e8b1f" (UID: "d67dea01-d6b4-461f-86c4-27afb15e8b1f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.492167 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67dea01-d6b4-461f-86c4-27afb15e8b1f-kube-api-access-kwbwn" (OuterVolumeSpecName: "kube-api-access-kwbwn") pod "d67dea01-d6b4-461f-86c4-27afb15e8b1f" (UID: "d67dea01-d6b4-461f-86c4-27afb15e8b1f"). InnerVolumeSpecName "kube-api-access-kwbwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.508096 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-scripts" (OuterVolumeSpecName: "scripts") pod "d67dea01-d6b4-461f-86c4-27afb15e8b1f" (UID: "d67dea01-d6b4-461f-86c4-27afb15e8b1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.508181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d67dea01-d6b4-461f-86c4-27afb15e8b1f" (UID: "d67dea01-d6b4-461f-86c4-27afb15e8b1f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.516551 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d67dea01-d6b4-461f-86c4-27afb15e8b1f" (UID: "d67dea01-d6b4-461f-86c4-27afb15e8b1f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.588795 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d67dea01-d6b4-461f-86c4-27afb15e8b1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.588860 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwbwn\" (UniqueName: \"kubernetes.io/projected/d67dea01-d6b4-461f-86c4-27afb15e8b1f-kube-api-access-kwbwn\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.588882 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.588900 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d67dea01-d6b4-461f-86c4-27afb15e8b1f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.588917 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d67dea01-d6b4-461f-86c4-27afb15e8b1f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:39 crc kubenswrapper[4831]: I0309 16:40:39.628528 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67dea01-d6b4-461f-86c4-27afb15e8b1f" path="/var/lib/kubelet/pods/d67dea01-d6b4-461f-86c4-27afb15e8b1f/volumes" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.280101 4831 scope.go:117] "RemoveContainer" containerID="133e72f5aeafc8981487ae59077b73fe5f75792e493243db5477fe2c920573a4" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.280161 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nk5b7" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.533646 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54h79"] Mar 09 16:40:40 crc kubenswrapper[4831]: E0309 16:40:40.534005 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67dea01-d6b4-461f-86c4-27afb15e8b1f" containerName="swift-ring-rebalance" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.534022 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67dea01-d6b4-461f-86c4-27afb15e8b1f" containerName="swift-ring-rebalance" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.534229 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67dea01-d6b4-461f-86c4-27afb15e8b1f" containerName="swift-ring-rebalance" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.534817 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.537308 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.537859 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.549622 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54h79"] Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.682072 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-scripts\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.682261 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-dispersionconf\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.682308 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-swiftconf\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.682349 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8hzd\" (UniqueName: \"kubernetes.io/projected/894f3708-431c-4309-83c8-bff89a9b8f54-kube-api-access-f8hzd\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.682468 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-ring-data-devices\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.682533 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/894f3708-431c-4309-83c8-bff89a9b8f54-etc-swift\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.785724 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-dispersionconf\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.785788 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-swiftconf\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.785839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8hzd\" (UniqueName: \"kubernetes.io/projected/894f3708-431c-4309-83c8-bff89a9b8f54-kube-api-access-f8hzd\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.785876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-ring-data-devices\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.785903 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/894f3708-431c-4309-83c8-bff89a9b8f54-etc-swift\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.785992 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-scripts\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.786893 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-scripts\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.788560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/894f3708-431c-4309-83c8-bff89a9b8f54-etc-swift\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.788697 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-ring-data-devices\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.802041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-swiftconf\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.811256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-dispersionconf\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.814984 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8hzd\" (UniqueName: \"kubernetes.io/projected/894f3708-431c-4309-83c8-bff89a9b8f54-kube-api-access-f8hzd\") pod \"swift-ring-rebalance-debug-54h79\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:40 crc kubenswrapper[4831]: I0309 16:40:40.850597 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:41 crc kubenswrapper[4831]: I0309 16:40:41.244880 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54h79"] Mar 09 16:40:41 crc kubenswrapper[4831]: W0309 16:40:41.251159 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod894f3708_431c_4309_83c8_bff89a9b8f54.slice/crio-bc2df700059dbb2f622eda9c0a2c0af13832176fd0dca38486299796906c0268 WatchSource:0}: Error finding container bc2df700059dbb2f622eda9c0a2c0af13832176fd0dca38486299796906c0268: Status 404 returned error can't find the container with id bc2df700059dbb2f622eda9c0a2c0af13832176fd0dca38486299796906c0268 Mar 09 16:40:41 crc kubenswrapper[4831]: I0309 16:40:41.292138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" event={"ID":"894f3708-431c-4309-83c8-bff89a9b8f54","Type":"ContainerStarted","Data":"bc2df700059dbb2f622eda9c0a2c0af13832176fd0dca38486299796906c0268"} Mar 09 16:40:42 crc kubenswrapper[4831]: I0309 16:40:42.302821 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" event={"ID":"894f3708-431c-4309-83c8-bff89a9b8f54","Type":"ContainerStarted","Data":"fc409beabce962debd7389315052a4e7870b27bf28e596e850b30ef5b22cad79"} Mar 09 16:40:42 crc kubenswrapper[4831]: I0309 16:40:42.321863 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" podStartSLOduration=2.321839778 podStartE2EDuration="2.321839778s" podCreationTimestamp="2026-03-09 16:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:42.317640629 +0000 UTC m=+2569.451323072" watchObservedRunningTime="2026-03-09 
16:40:42.321839778 +0000 UTC m=+2569.455522211" Mar 09 16:40:43 crc kubenswrapper[4831]: I0309 16:40:43.314484 4831 generic.go:334] "Generic (PLEG): container finished" podID="894f3708-431c-4309-83c8-bff89a9b8f54" containerID="fc409beabce962debd7389315052a4e7870b27bf28e596e850b30ef5b22cad79" exitCode=0 Mar 09 16:40:43 crc kubenswrapper[4831]: I0309 16:40:43.314719 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" event={"ID":"894f3708-431c-4309-83c8-bff89a9b8f54","Type":"ContainerDied","Data":"fc409beabce962debd7389315052a4e7870b27bf28e596e850b30ef5b22cad79"} Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.650503 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.677356 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54h79"] Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.682199 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54h79"] Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.741274 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-scripts\") pod \"894f3708-431c-4309-83c8-bff89a9b8f54\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.741345 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-dispersionconf\") pod \"894f3708-431c-4309-83c8-bff89a9b8f54\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.741383 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-ring-data-devices\") pod \"894f3708-431c-4309-83c8-bff89a9b8f54\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.741442 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-swiftconf\") pod \"894f3708-431c-4309-83c8-bff89a9b8f54\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.741490 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8hzd\" (UniqueName: \"kubernetes.io/projected/894f3708-431c-4309-83c8-bff89a9b8f54-kube-api-access-f8hzd\") pod \"894f3708-431c-4309-83c8-bff89a9b8f54\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.741537 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/894f3708-431c-4309-83c8-bff89a9b8f54-etc-swift\") pod \"894f3708-431c-4309-83c8-bff89a9b8f54\" (UID: \"894f3708-431c-4309-83c8-bff89a9b8f54\") " Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.742358 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "894f3708-431c-4309-83c8-bff89a9b8f54" (UID: "894f3708-431c-4309-83c8-bff89a9b8f54"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.742734 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/894f3708-431c-4309-83c8-bff89a9b8f54-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "894f3708-431c-4309-83c8-bff89a9b8f54" (UID: "894f3708-431c-4309-83c8-bff89a9b8f54"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.755622 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894f3708-431c-4309-83c8-bff89a9b8f54-kube-api-access-f8hzd" (OuterVolumeSpecName: "kube-api-access-f8hzd") pod "894f3708-431c-4309-83c8-bff89a9b8f54" (UID: "894f3708-431c-4309-83c8-bff89a9b8f54"). InnerVolumeSpecName "kube-api-access-f8hzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.764076 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-scripts" (OuterVolumeSpecName: "scripts") pod "894f3708-431c-4309-83c8-bff89a9b8f54" (UID: "894f3708-431c-4309-83c8-bff89a9b8f54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.766589 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "894f3708-431c-4309-83c8-bff89a9b8f54" (UID: "894f3708-431c-4309-83c8-bff89a9b8f54"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.766925 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "894f3708-431c-4309-83c8-bff89a9b8f54" (UID: "894f3708-431c-4309-83c8-bff89a9b8f54"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.842424 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8hzd\" (UniqueName: \"kubernetes.io/projected/894f3708-431c-4309-83c8-bff89a9b8f54-kube-api-access-f8hzd\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.842470 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/894f3708-431c-4309-83c8-bff89a9b8f54-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.842484 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.842497 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.842508 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/894f3708-431c-4309-83c8-bff89a9b8f54-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:44 crc kubenswrapper[4831]: I0309 16:40:44.842518 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/894f3708-431c-4309-83c8-bff89a9b8f54-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:45 crc kubenswrapper[4831]: I0309 16:40:45.332187 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc2df700059dbb2f622eda9c0a2c0af13832176fd0dca38486299796906c0268" Mar 09 16:40:45 crc kubenswrapper[4831]: I0309 16:40:45.332254 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54h79" Mar 09 16:40:45 crc kubenswrapper[4831]: I0309 16:40:45.628353 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="894f3708-431c-4309-83c8-bff89a9b8f54" path="/var/lib/kubelet/pods/894f3708-431c-4309-83c8-bff89a9b8f54/volumes" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.137622 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn"] Mar 09 16:40:46 crc kubenswrapper[4831]: E0309 16:40:46.137948 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894f3708-431c-4309-83c8-bff89a9b8f54" containerName="swift-ring-rebalance" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.137963 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="894f3708-431c-4309-83c8-bff89a9b8f54" containerName="swift-ring-rebalance" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.138142 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="894f3708-431c-4309-83c8-bff89a9b8f54" containerName="swift-ring-rebalance" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.138716 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.140492 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.142766 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.148898 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn"] Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.179697 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fmpm\" (UniqueName: \"kubernetes.io/projected/54c156ee-af46-4caf-a7e9-f595f8d48dfd-kube-api-access-8fmpm\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.179765 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/54c156ee-af46-4caf-a7e9-f595f8d48dfd-etc-swift\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.179838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-scripts\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.179882 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-dispersionconf\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.179906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-swiftconf\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.180018 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-ring-data-devices\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.281213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-scripts\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.281289 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-dispersionconf\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc 
kubenswrapper[4831]: I0309 16:40:46.281324 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-swiftconf\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.281352 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-ring-data-devices\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.281442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fmpm\" (UniqueName: \"kubernetes.io/projected/54c156ee-af46-4caf-a7e9-f595f8d48dfd-kube-api-access-8fmpm\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.281480 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/54c156ee-af46-4caf-a7e9-f595f8d48dfd-etc-swift\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.281920 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/54c156ee-af46-4caf-a7e9-f595f8d48dfd-etc-swift\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 
crc kubenswrapper[4831]: I0309 16:40:46.282179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-scripts\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.282291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-ring-data-devices\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.291417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-swiftconf\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.291417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-dispersionconf\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.297216 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fmpm\" (UniqueName: \"kubernetes.io/projected/54c156ee-af46-4caf-a7e9-f595f8d48dfd-kube-api-access-8fmpm\") pod \"swift-ring-rebalance-debug-b9dnn\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: 
I0309 16:40:46.461169 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:46 crc kubenswrapper[4831]: I0309 16:40:46.915012 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn"] Mar 09 16:40:46 crc kubenswrapper[4831]: W0309 16:40:46.929752 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54c156ee_af46_4caf_a7e9_f595f8d48dfd.slice/crio-485a95d37cbc464a2830d89ca5a7f4578bbe4066d1ed5deadd03db925fd5cf5b WatchSource:0}: Error finding container 485a95d37cbc464a2830d89ca5a7f4578bbe4066d1ed5deadd03db925fd5cf5b: Status 404 returned error can't find the container with id 485a95d37cbc464a2830d89ca5a7f4578bbe4066d1ed5deadd03db925fd5cf5b Mar 09 16:40:47 crc kubenswrapper[4831]: I0309 16:40:47.357057 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" event={"ID":"54c156ee-af46-4caf-a7e9-f595f8d48dfd","Type":"ContainerStarted","Data":"a15f8f9e482edf0e49c9294fb3e0e39d799cdb93fc590cc73692e1409710ce8d"} Mar 09 16:40:47 crc kubenswrapper[4831]: I0309 16:40:47.358233 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" event={"ID":"54c156ee-af46-4caf-a7e9-f595f8d48dfd","Type":"ContainerStarted","Data":"485a95d37cbc464a2830d89ca5a7f4578bbe4066d1ed5deadd03db925fd5cf5b"} Mar 09 16:40:47 crc kubenswrapper[4831]: I0309 16:40:47.387624 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" podStartSLOduration=1.387606281 podStartE2EDuration="1.387606281s" podCreationTimestamp="2026-03-09 16:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:47.374293821 +0000 
UTC m=+2574.507976254" watchObservedRunningTime="2026-03-09 16:40:47.387606281 +0000 UTC m=+2574.521288714" Mar 09 16:40:49 crc kubenswrapper[4831]: I0309 16:40:49.377689 4831 generic.go:334] "Generic (PLEG): container finished" podID="54c156ee-af46-4caf-a7e9-f595f8d48dfd" containerID="a15f8f9e482edf0e49c9294fb3e0e39d799cdb93fc590cc73692e1409710ce8d" exitCode=0 Mar 09 16:40:49 crc kubenswrapper[4831]: I0309 16:40:49.377789 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" event={"ID":"54c156ee-af46-4caf-a7e9-f595f8d48dfd","Type":"ContainerDied","Data":"a15f8f9e482edf0e49c9294fb3e0e39d799cdb93fc590cc73692e1409710ce8d"} Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.643205 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.691952 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn"] Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.700737 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn"] Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.748901 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-ring-data-devices\") pod \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.749046 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-scripts\") pod \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 
16:40:50.749087 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/54c156ee-af46-4caf-a7e9-f595f8d48dfd-etc-swift\") pod \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.749146 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-swiftconf\") pod \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.749175 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-dispersionconf\") pod \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.749243 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fmpm\" (UniqueName: \"kubernetes.io/projected/54c156ee-af46-4caf-a7e9-f595f8d48dfd-kube-api-access-8fmpm\") pod \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\" (UID: \"54c156ee-af46-4caf-a7e9-f595f8d48dfd\") " Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.749847 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "54c156ee-af46-4caf-a7e9-f595f8d48dfd" (UID: "54c156ee-af46-4caf-a7e9-f595f8d48dfd"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.749946 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c156ee-af46-4caf-a7e9-f595f8d48dfd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "54c156ee-af46-4caf-a7e9-f595f8d48dfd" (UID: "54c156ee-af46-4caf-a7e9-f595f8d48dfd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.759921 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c156ee-af46-4caf-a7e9-f595f8d48dfd-kube-api-access-8fmpm" (OuterVolumeSpecName: "kube-api-access-8fmpm") pod "54c156ee-af46-4caf-a7e9-f595f8d48dfd" (UID: "54c156ee-af46-4caf-a7e9-f595f8d48dfd"). InnerVolumeSpecName "kube-api-access-8fmpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.774523 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "54c156ee-af46-4caf-a7e9-f595f8d48dfd" (UID: "54c156ee-af46-4caf-a7e9-f595f8d48dfd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.774961 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-scripts" (OuterVolumeSpecName: "scripts") pod "54c156ee-af46-4caf-a7e9-f595f8d48dfd" (UID: "54c156ee-af46-4caf-a7e9-f595f8d48dfd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.779336 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "54c156ee-af46-4caf-a7e9-f595f8d48dfd" (UID: "54c156ee-af46-4caf-a7e9-f595f8d48dfd"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.850481 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.850523 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fmpm\" (UniqueName: \"kubernetes.io/projected/54c156ee-af46-4caf-a7e9-f595f8d48dfd-kube-api-access-8fmpm\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.850536 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.850546 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c156ee-af46-4caf-a7e9-f595f8d48dfd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.850555 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/54c156ee-af46-4caf-a7e9-f595f8d48dfd-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:50 crc kubenswrapper[4831]: I0309 16:40:50.850563 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/54c156ee-af46-4caf-a7e9-f595f8d48dfd-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.396556 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="485a95d37cbc464a2830d89ca5a7f4578bbe4066d1ed5deadd03db925fd5cf5b" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.396699 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9dnn" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.617743 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:40:51 crc kubenswrapper[4831]: E0309 16:40:51.617998 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.628355 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c156ee-af46-4caf-a7e9-f595f8d48dfd" path="/var/lib/kubelet/pods/54c156ee-af46-4caf-a7e9-f595f8d48dfd/volumes" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.842135 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw"] Mar 09 16:40:51 crc kubenswrapper[4831]: E0309 16:40:51.842466 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c156ee-af46-4caf-a7e9-f595f8d48dfd" containerName="swift-ring-rebalance" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.842493 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c156ee-af46-4caf-a7e9-f595f8d48dfd" 
containerName="swift-ring-rebalance" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.842705 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c156ee-af46-4caf-a7e9-f595f8d48dfd" containerName="swift-ring-rebalance" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.843286 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.845289 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.846119 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.856167 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw"] Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.966859 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-dispersionconf\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.967172 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a522046-bc41-427d-9032-3757afcd67ad-etc-swift\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.967195 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-swiftconf\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.967234 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xspsq\" (UniqueName: \"kubernetes.io/projected/9a522046-bc41-427d-9032-3757afcd67ad-kube-api-access-xspsq\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.967766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-scripts\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:51 crc kubenswrapper[4831]: I0309 16:40:51.967859 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-ring-data-devices\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.069308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-scripts\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.069358 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-ring-data-devices\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.069379 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-dispersionconf\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.069410 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a522046-bc41-427d-9032-3757afcd67ad-etc-swift\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.069433 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-swiftconf\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.069463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xspsq\" (UniqueName: \"kubernetes.io/projected/9a522046-bc41-427d-9032-3757afcd67ad-kube-api-access-xspsq\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.070305 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-scripts\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.070361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-ring-data-devices\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.070643 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a522046-bc41-427d-9032-3757afcd67ad-etc-swift\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.073666 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-swiftconf\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.081524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-dispersionconf\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.094786 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xspsq\" (UniqueName: \"kubernetes.io/projected/9a522046-bc41-427d-9032-3757afcd67ad-kube-api-access-xspsq\") pod \"swift-ring-rebalance-debug-8s6lw\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.158892 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:52 crc kubenswrapper[4831]: I0309 16:40:52.572455 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw"] Mar 09 16:40:53 crc kubenswrapper[4831]: I0309 16:40:53.414216 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" event={"ID":"9a522046-bc41-427d-9032-3757afcd67ad","Type":"ContainerStarted","Data":"e5952d3a3a64edd3791066ece2f9d2331e4c0637feb331018b610a6febb050dc"} Mar 09 16:40:53 crc kubenswrapper[4831]: I0309 16:40:53.414541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" event={"ID":"9a522046-bc41-427d-9032-3757afcd67ad","Type":"ContainerStarted","Data":"d7d7f6694fa501ef1074caee560fd4b8550c1a9edff36e12b960fae0aee8026d"} Mar 09 16:40:53 crc kubenswrapper[4831]: I0309 16:40:53.436363 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" podStartSLOduration=2.436342007 podStartE2EDuration="2.436342007s" podCreationTimestamp="2026-03-09 16:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:53.431585921 +0000 UTC m=+2580.565268354" watchObservedRunningTime="2026-03-09 16:40:53.436342007 +0000 UTC m=+2580.570024430" Mar 09 16:40:54 crc kubenswrapper[4831]: I0309 16:40:54.423690 4831 generic.go:334] "Generic 
(PLEG): container finished" podID="9a522046-bc41-427d-9032-3757afcd67ad" containerID="e5952d3a3a64edd3791066ece2f9d2331e4c0637feb331018b610a6febb050dc" exitCode=0 Mar 09 16:40:54 crc kubenswrapper[4831]: I0309 16:40:54.423732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" event={"ID":"9a522046-bc41-427d-9032-3757afcd67ad","Type":"ContainerDied","Data":"e5952d3a3a64edd3791066ece2f9d2331e4c0637feb331018b610a6febb050dc"} Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.754384 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.790831 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw"] Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.797851 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw"] Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.925021 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-scripts\") pod \"9a522046-bc41-427d-9032-3757afcd67ad\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.925087 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a522046-bc41-427d-9032-3757afcd67ad-etc-swift\") pod \"9a522046-bc41-427d-9032-3757afcd67ad\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.925135 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-swiftconf\") pod 
\"9a522046-bc41-427d-9032-3757afcd67ad\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.925191 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-ring-data-devices\") pod \"9a522046-bc41-427d-9032-3757afcd67ad\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.925231 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-dispersionconf\") pod \"9a522046-bc41-427d-9032-3757afcd67ad\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.925251 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xspsq\" (UniqueName: \"kubernetes.io/projected/9a522046-bc41-427d-9032-3757afcd67ad-kube-api-access-xspsq\") pod \"9a522046-bc41-427d-9032-3757afcd67ad\" (UID: \"9a522046-bc41-427d-9032-3757afcd67ad\") " Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.925868 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a522046-bc41-427d-9032-3757afcd67ad-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9a522046-bc41-427d-9032-3757afcd67ad" (UID: "9a522046-bc41-427d-9032-3757afcd67ad"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.926497 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9a522046-bc41-427d-9032-3757afcd67ad" (UID: "9a522046-bc41-427d-9032-3757afcd67ad"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.931237 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a522046-bc41-427d-9032-3757afcd67ad-kube-api-access-xspsq" (OuterVolumeSpecName: "kube-api-access-xspsq") pod "9a522046-bc41-427d-9032-3757afcd67ad" (UID: "9a522046-bc41-427d-9032-3757afcd67ad"). InnerVolumeSpecName "kube-api-access-xspsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.946461 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-scripts" (OuterVolumeSpecName: "scripts") pod "9a522046-bc41-427d-9032-3757afcd67ad" (UID: "9a522046-bc41-427d-9032-3757afcd67ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.947852 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9a522046-bc41-427d-9032-3757afcd67ad" (UID: "9a522046-bc41-427d-9032-3757afcd67ad"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:55 crc kubenswrapper[4831]: I0309 16:40:55.948500 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9a522046-bc41-427d-9032-3757afcd67ad" (UID: "9a522046-bc41-427d-9032-3757afcd67ad"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.026809 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.026841 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a522046-bc41-427d-9032-3757afcd67ad-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.026852 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.026861 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a522046-bc41-427d-9032-3757afcd67ad-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.026871 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a522046-bc41-427d-9032-3757afcd67ad-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.026879 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xspsq\" (UniqueName: \"kubernetes.io/projected/9a522046-bc41-427d-9032-3757afcd67ad-kube-api-access-xspsq\") on node \"crc\" DevicePath \"\"" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.445036 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7d7f6694fa501ef1074caee560fd4b8550c1a9edff36e12b960fae0aee8026d" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.445120 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8s6lw" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.928277 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-49tw4"] Mar 09 16:40:56 crc kubenswrapper[4831]: E0309 16:40:56.929520 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a522046-bc41-427d-9032-3757afcd67ad" containerName="swift-ring-rebalance" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.929673 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a522046-bc41-427d-9032-3757afcd67ad" containerName="swift-ring-rebalance" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.930030 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a522046-bc41-427d-9032-3757afcd67ad" containerName="swift-ring-rebalance" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.930974 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.934677 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.936545 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-49tw4"] Mar 09 16:40:56 crc kubenswrapper[4831]: I0309 16:40:56.937972 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.043334 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-dispersionconf\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.043468 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-swiftconf\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.043519 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-scripts\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.043549 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-ring-data-devices\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.043566 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krddg\" (UniqueName: \"kubernetes.io/projected/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-kube-api-access-krddg\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.043589 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-etc-swift\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.144767 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-etc-swift\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.144873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-dispersionconf\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.144912 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-swiftconf\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.144961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-scripts\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.145000 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-ring-data-devices\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.145027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krddg\" (UniqueName: \"kubernetes.io/projected/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-kube-api-access-krddg\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.146182 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-scripts\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.146238 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-etc-swift\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.146434 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-ring-data-devices\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.149747 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-dispersionconf\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.150273 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-swiftconf\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.183036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krddg\" (UniqueName: \"kubernetes.io/projected/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-kube-api-access-krddg\") pod \"swift-ring-rebalance-debug-49tw4\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.248131 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.627784 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a522046-bc41-427d-9032-3757afcd67ad" path="/var/lib/kubelet/pods/9a522046-bc41-427d-9032-3757afcd67ad/volumes" Mar 09 16:40:57 crc kubenswrapper[4831]: I0309 16:40:57.688903 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-49tw4"] Mar 09 16:40:58 crc kubenswrapper[4831]: I0309 16:40:58.471923 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" event={"ID":"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc","Type":"ContainerStarted","Data":"e9b2b389a9bd17a4f97a29b6537d0db0331d171ab8a64fb380987b79320915a8"} Mar 09 16:40:58 crc kubenswrapper[4831]: I0309 16:40:58.472359 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" event={"ID":"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc","Type":"ContainerStarted","Data":"af32206aa441e2506567307cf4ae032fa984cf9f050709a1148233045e396df7"} Mar 09 16:40:58 crc kubenswrapper[4831]: I0309 16:40:58.498299 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" podStartSLOduration=2.49828022 podStartE2EDuration="2.49828022s" podCreationTimestamp="2026-03-09 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:40:58.492328101 +0000 UTC m=+2585.626010524" watchObservedRunningTime="2026-03-09 16:40:58.49828022 +0000 UTC m=+2585.631962643" Mar 09 16:40:59 crc kubenswrapper[4831]: I0309 16:40:59.483899 4831 generic.go:334] "Generic (PLEG): container finished" podID="1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" containerID="e9b2b389a9bd17a4f97a29b6537d0db0331d171ab8a64fb380987b79320915a8" exitCode=0 
Mar 09 16:40:59 crc kubenswrapper[4831]: I0309 16:40:59.484004 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" event={"ID":"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc","Type":"ContainerDied","Data":"e9b2b389a9bd17a4f97a29b6537d0db0331d171ab8a64fb380987b79320915a8"} Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.791863 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.831809 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-49tw4"] Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.838612 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-49tw4"] Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.911965 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krddg\" (UniqueName: \"kubernetes.io/projected/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-kube-api-access-krddg\") pod \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.912048 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-ring-data-devices\") pod \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.912090 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-swiftconf\") pod \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 
16:41:00.912142 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-etc-swift\") pod \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.912208 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-scripts\") pod \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.912242 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-dispersionconf\") pod \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\" (UID: \"1c4978ff-dcdf-4384-ae62-f1c9d1f928cc\") " Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.912592 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" (UID: "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.913439 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" (UID: "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.919201 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-kube-api-access-krddg" (OuterVolumeSpecName: "kube-api-access-krddg") pod "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" (UID: "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc"). InnerVolumeSpecName "kube-api-access-krddg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.933919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-scripts" (OuterVolumeSpecName: "scripts") pod "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" (UID: "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.937136 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" (UID: "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:00 crc kubenswrapper[4831]: I0309 16:41:00.947519 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" (UID: "1c4978ff-dcdf-4384-ae62-f1c9d1f928cc"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.013655 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.013690 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.013704 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krddg\" (UniqueName: \"kubernetes.io/projected/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-kube-api-access-krddg\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.013712 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.013722 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.013729 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.517518 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af32206aa441e2506567307cf4ae032fa984cf9f050709a1148233045e396df7" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.517640 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-49tw4" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.628461 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" path="/var/lib/kubelet/pods/1c4978ff-dcdf-4384-ae62-f1c9d1f928cc/volumes" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.989946 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6"] Mar 09 16:41:01 crc kubenswrapper[4831]: E0309 16:41:01.990310 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" containerName="swift-ring-rebalance" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.990324 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" containerName="swift-ring-rebalance" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.990508 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4978ff-dcdf-4384-ae62-f1c9d1f928cc" containerName="swift-ring-rebalance" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.991047 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.993358 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:01 crc kubenswrapper[4831]: I0309 16:41:01.993951 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.000432 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6"] Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.129819 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwff2\" (UniqueName: \"kubernetes.io/projected/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-kube-api-access-vwff2\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.129875 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-etc-swift\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.130034 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-ring-data-devices\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.130075 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-scripts\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.130120 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-swiftconf\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.130145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-dispersionconf\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.231438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-ring-data-devices\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.231540 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-scripts\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 
16:41:02.231593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-swiftconf\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.231618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-dispersionconf\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.231650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwff2\" (UniqueName: \"kubernetes.io/projected/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-kube-api-access-vwff2\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.231715 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-etc-swift\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.232304 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-etc-swift\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 
16:41:02.232512 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-scripts\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.232876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-ring-data-devices\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.237215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-swiftconf\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.238244 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-dispersionconf\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.249567 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwff2\" (UniqueName: \"kubernetes.io/projected/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-kube-api-access-vwff2\") pod \"swift-ring-rebalance-debug-bjpc6\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.305355 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:02 crc kubenswrapper[4831]: I0309 16:41:02.568160 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6"] Mar 09 16:41:02 crc kubenswrapper[4831]: W0309 16:41:02.574052 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e94e7ff_7a51_4c87_bc16_b9b6cd77b3ea.slice/crio-8d0aeb5a1473db302691ada1c74b85a6fd2f3378f886c7de2ff91351482e0b80 WatchSource:0}: Error finding container 8d0aeb5a1473db302691ada1c74b85a6fd2f3378f886c7de2ff91351482e0b80: Status 404 returned error can't find the container with id 8d0aeb5a1473db302691ada1c74b85a6fd2f3378f886c7de2ff91351482e0b80 Mar 09 16:41:03 crc kubenswrapper[4831]: I0309 16:41:03.539170 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" event={"ID":"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea","Type":"ContainerStarted","Data":"b9f7a3b71a3c5f81135cf57998d3838a7d58493d8a41b4bbddaa457b6e262981"} Mar 09 16:41:03 crc kubenswrapper[4831]: I0309 16:41:03.539603 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" event={"ID":"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea","Type":"ContainerStarted","Data":"8d0aeb5a1473db302691ada1c74b85a6fd2f3378f886c7de2ff91351482e0b80"} Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.480008 4831 scope.go:117] "RemoveContainer" containerID="f928b482fbaf1519c924d3442f3129d823c5d3d08ed7bcea131370dc6ed6848c" Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.510836 4831 scope.go:117] "RemoveContainer" containerID="e27105abccfd9a9b453ab5054fb08143b5cfdecb74f65f67940d1b4269481aa9" Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.540759 4831 scope.go:117] "RemoveContainer" 
containerID="0ada448bbd3493a6a3ac42311856d0208188814374ba658943fd8763db316bcb" Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.555359 4831 generic.go:334] "Generic (PLEG): container finished" podID="8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" containerID="b9f7a3b71a3c5f81135cf57998d3838a7d58493d8a41b4bbddaa457b6e262981" exitCode=0 Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.555441 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" event={"ID":"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea","Type":"ContainerDied","Data":"b9f7a3b71a3c5f81135cf57998d3838a7d58493d8a41b4bbddaa457b6e262981"} Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.599559 4831 scope.go:117] "RemoveContainer" containerID="15c8e214628985457e57ebcc27b33483f53bf932bceb8284c95b0605853fb2cd" Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.621729 4831 scope.go:117] "RemoveContainer" containerID="dea59737a72a5cdd827cedd454cc822a00decbad58254f4e770b96a46b47c61e" Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.655890 4831 scope.go:117] "RemoveContainer" containerID="97f154757044ced4e0a4c573755be3ef0e1f87b3961db4bfa6c1a42cf5a7d6dd" Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.685552 4831 scope.go:117] "RemoveContainer" containerID="7c42301a71b85bb9ea00a327024cf90b1bbb280b876a6b2c466544d622608294" Mar 09 16:41:04 crc kubenswrapper[4831]: I0309 16:41:04.729037 4831 scope.go:117] "RemoveContainer" containerID="060da24d46167f525d6cf4a46a136ede24c510fc94ae7f26c9e3957f14998246" Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.872783 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.903890 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6"] Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.910791 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6"] Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.990365 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-ring-data-devices\") pod \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.990471 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-scripts\") pod \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.990556 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-swiftconf\") pod \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.990605 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwff2\" (UniqueName: \"kubernetes.io/projected/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-kube-api-access-vwff2\") pod \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.990683 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-etc-swift\") pod \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.990740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-dispersionconf\") pod \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\" (UID: \"8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea\") " Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.991042 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" (UID: "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.991197 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.991478 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" (UID: "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:05 crc kubenswrapper[4831]: I0309 16:41:05.996480 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-kube-api-access-vwff2" (OuterVolumeSpecName: "kube-api-access-vwff2") pod "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" (UID: "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea"). InnerVolumeSpecName "kube-api-access-vwff2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.012731 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" (UID: "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.023140 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-scripts" (OuterVolumeSpecName: "scripts") pod "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" (UID: "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.026873 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" (UID: "8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.092901 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwff2\" (UniqueName: \"kubernetes.io/projected/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-kube-api-access-vwff2\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.092934 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.092945 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.092953 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.092962 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.578792 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d0aeb5a1473db302691ada1c74b85a6fd2f3378f886c7de2ff91351482e0b80" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.578853 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjpc6" Mar 09 16:41:06 crc kubenswrapper[4831]: I0309 16:41:06.618113 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:41:06 crc kubenswrapper[4831]: E0309 16:41:06.618292 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.034115 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb"] Mar 09 16:41:07 crc kubenswrapper[4831]: E0309 16:41:07.035058 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" containerName="swift-ring-rebalance" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.035159 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" containerName="swift-ring-rebalance" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.035452 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" containerName="swift-ring-rebalance" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.036113 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.039553 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.039557 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.041672 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb"] Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.208882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.208967 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znz72\" (UniqueName: \"kubernetes.io/projected/d4f67cee-f75e-4126-b123-d0c4f0990ac7-kube-api-access-znz72\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.209099 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-dispersionconf\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.209140 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4f67cee-f75e-4126-b123-d0c4f0990ac7-etc-swift\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.209197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-swiftconf\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.209558 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-scripts\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.313671 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-scripts\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.313739 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 
16:41:07.313775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znz72\" (UniqueName: \"kubernetes.io/projected/d4f67cee-f75e-4126-b123-d0c4f0990ac7-kube-api-access-znz72\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.313817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-dispersionconf\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.313844 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4f67cee-f75e-4126-b123-d0c4f0990ac7-etc-swift\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.313879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-swiftconf\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.314295 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4f67cee-f75e-4126-b123-d0c4f0990ac7-etc-swift\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 
16:41:07.314584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-scripts\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.314664 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.319303 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-dispersionconf\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.324907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-swiftconf\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.330280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znz72\" (UniqueName: \"kubernetes.io/projected/d4f67cee-f75e-4126-b123-d0c4f0990ac7-kube-api-access-znz72\") pod \"swift-ring-rebalance-debug-bj5pb\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.361522 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.628660 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea" path="/var/lib/kubelet/pods/8e94e7ff-7a51-4c87-bc16-b9b6cd77b3ea/volumes" Mar 09 16:41:07 crc kubenswrapper[4831]: I0309 16:41:07.647932 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb"] Mar 09 16:41:08 crc kubenswrapper[4831]: I0309 16:41:08.620017 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" event={"ID":"d4f67cee-f75e-4126-b123-d0c4f0990ac7","Type":"ContainerStarted","Data":"420fb088f7b49d7095b6ed66d09d8315445a493d295c84ae73ef8a51c239ad95"} Mar 09 16:41:08 crc kubenswrapper[4831]: I0309 16:41:08.620734 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" event={"ID":"d4f67cee-f75e-4126-b123-d0c4f0990ac7","Type":"ContainerStarted","Data":"1038a5d5366c9a14c7221dd0ce6ba96dc5864cae2399035770639432526bfecb"} Mar 09 16:41:08 crc kubenswrapper[4831]: I0309 16:41:08.641122 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" podStartSLOduration=1.64110479 podStartE2EDuration="1.64110479s" podCreationTimestamp="2026-03-09 16:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:08.639959087 +0000 UTC m=+2595.773641530" watchObservedRunningTime="2026-03-09 16:41:08.64110479 +0000 UTC m=+2595.774787223" Mar 09 16:41:09 crc kubenswrapper[4831]: I0309 16:41:09.629256 4831 generic.go:334] "Generic (PLEG): container finished" podID="d4f67cee-f75e-4126-b123-d0c4f0990ac7" 
containerID="420fb088f7b49d7095b6ed66d09d8315445a493d295c84ae73ef8a51c239ad95" exitCode=0 Mar 09 16:41:09 crc kubenswrapper[4831]: I0309 16:41:09.629299 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" event={"ID":"d4f67cee-f75e-4126-b123-d0c4f0990ac7","Type":"ContainerDied","Data":"420fb088f7b49d7095b6ed66d09d8315445a493d295c84ae73ef8a51c239ad95"} Mar 09 16:41:10 crc kubenswrapper[4831]: I0309 16:41:10.915849 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:10 crc kubenswrapper[4831]: I0309 16:41:10.947289 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb"] Mar 09 16:41:10 crc kubenswrapper[4831]: I0309 16:41:10.955056 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb"] Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.066010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-scripts\") pod \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.066342 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-dispersionconf\") pod \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.066523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znz72\" (UniqueName: \"kubernetes.io/projected/d4f67cee-f75e-4126-b123-d0c4f0990ac7-kube-api-access-znz72\") pod \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\" (UID: 
\"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.066624 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4f67cee-f75e-4126-b123-d0c4f0990ac7-etc-swift\") pod \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.066724 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-swiftconf\") pod \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.066808 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-ring-data-devices\") pod \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\" (UID: \"d4f67cee-f75e-4126-b123-d0c4f0990ac7\") " Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.067249 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d4f67cee-f75e-4126-b123-d0c4f0990ac7" (UID: "d4f67cee-f75e-4126-b123-d0c4f0990ac7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.067436 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4f67cee-f75e-4126-b123-d0c4f0990ac7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d4f67cee-f75e-4126-b123-d0c4f0990ac7" (UID: "d4f67cee-f75e-4126-b123-d0c4f0990ac7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.074690 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f67cee-f75e-4126-b123-d0c4f0990ac7-kube-api-access-znz72" (OuterVolumeSpecName: "kube-api-access-znz72") pod "d4f67cee-f75e-4126-b123-d0c4f0990ac7" (UID: "d4f67cee-f75e-4126-b123-d0c4f0990ac7"). InnerVolumeSpecName "kube-api-access-znz72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.089545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-scripts" (OuterVolumeSpecName: "scripts") pod "d4f67cee-f75e-4126-b123-d0c4f0990ac7" (UID: "d4f67cee-f75e-4126-b123-d0c4f0990ac7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.096442 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d4f67cee-f75e-4126-b123-d0c4f0990ac7" (UID: "d4f67cee-f75e-4126-b123-d0c4f0990ac7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.100650 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d4f67cee-f75e-4126-b123-d0c4f0990ac7" (UID: "d4f67cee-f75e-4126-b123-d0c4f0990ac7"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.168123 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.168168 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.168180 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4f67cee-f75e-4126-b123-d0c4f0990ac7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.168189 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4f67cee-f75e-4126-b123-d0c4f0990ac7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.168201 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znz72\" (UniqueName: \"kubernetes.io/projected/d4f67cee-f75e-4126-b123-d0c4f0990ac7-kube-api-access-znz72\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.168211 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4f67cee-f75e-4126-b123-d0c4f0990ac7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.627032 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4f67cee-f75e-4126-b123-d0c4f0990ac7" path="/var/lib/kubelet/pods/d4f67cee-f75e-4126-b123-d0c4f0990ac7/volumes" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.649790 4831 scope.go:117] "RemoveContainer" 
containerID="420fb088f7b49d7095b6ed66d09d8315445a493d295c84ae73ef8a51c239ad95" Mar 09 16:41:11 crc kubenswrapper[4831]: I0309 16:41:11.649938 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5pb" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.087498 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-s27g9"] Mar 09 16:41:12 crc kubenswrapper[4831]: E0309 16:41:12.088288 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f67cee-f75e-4126-b123-d0c4f0990ac7" containerName="swift-ring-rebalance" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.088312 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f67cee-f75e-4126-b123-d0c4f0990ac7" containerName="swift-ring-rebalance" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.088718 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f67cee-f75e-4126-b123-d0c4f0990ac7" containerName="swift-ring-rebalance" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.089502 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.091992 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.093150 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.101068 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-s27g9"] Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.284073 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-ring-data-devices\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.284167 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-dispersionconf\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.284198 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-swiftconf\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.284255 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5ggd\" (UniqueName: \"kubernetes.io/projected/ca99aabe-1441-457c-a6cf-37e683abeb83-kube-api-access-c5ggd\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.284296 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca99aabe-1441-457c-a6cf-37e683abeb83-etc-swift\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.284343 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-scripts\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.385937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-swiftconf\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.386007 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-dispersionconf\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc 
kubenswrapper[4831]: I0309 16:41:12.386052 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5ggd\" (UniqueName: \"kubernetes.io/projected/ca99aabe-1441-457c-a6cf-37e683abeb83-kube-api-access-c5ggd\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.386092 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca99aabe-1441-457c-a6cf-37e683abeb83-etc-swift\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.386134 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-scripts\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.386249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-ring-data-devices\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.386803 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca99aabe-1441-457c-a6cf-37e683abeb83-etc-swift\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 
crc kubenswrapper[4831]: I0309 16:41:12.387248 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-scripts\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.387384 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-ring-data-devices\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.395029 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-dispersionconf\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.395765 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-swiftconf\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.411506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5ggd\" (UniqueName: \"kubernetes.io/projected/ca99aabe-1441-457c-a6cf-37e683abeb83-kube-api-access-c5ggd\") pod \"swift-ring-rebalance-debug-s27g9\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: 
I0309 16:41:12.460326 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:12 crc kubenswrapper[4831]: I0309 16:41:12.724630 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-s27g9"] Mar 09 16:41:13 crc kubenswrapper[4831]: I0309 16:41:13.727968 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" event={"ID":"ca99aabe-1441-457c-a6cf-37e683abeb83","Type":"ContainerStarted","Data":"339ed4b6687c89305287281835ab24e886a3caa443bbd4e16191d08577d8df92"} Mar 09 16:41:13 crc kubenswrapper[4831]: I0309 16:41:13.728236 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" event={"ID":"ca99aabe-1441-457c-a6cf-37e683abeb83","Type":"ContainerStarted","Data":"ddd40db37850f2ec7b11a61cdabfda99219ca2cd820b7ddc76f4d8785dcb752c"} Mar 09 16:41:13 crc kubenswrapper[4831]: I0309 16:41:13.750295 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" podStartSLOduration=1.7502753709999999 podStartE2EDuration="1.750275371s" podCreationTimestamp="2026-03-09 16:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:13.744054814 +0000 UTC m=+2600.877737287" watchObservedRunningTime="2026-03-09 16:41:13.750275371 +0000 UTC m=+2600.883957794" Mar 09 16:41:14 crc kubenswrapper[4831]: I0309 16:41:14.739160 4831 generic.go:334] "Generic (PLEG): container finished" podID="ca99aabe-1441-457c-a6cf-37e683abeb83" containerID="339ed4b6687c89305287281835ab24e886a3caa443bbd4e16191d08577d8df92" exitCode=0 Mar 09 16:41:14 crc kubenswrapper[4831]: I0309 16:41:14.739217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" event={"ID":"ca99aabe-1441-457c-a6cf-37e683abeb83","Type":"ContainerDied","Data":"339ed4b6687c89305287281835ab24e886a3caa443bbd4e16191d08577d8df92"} Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.034215 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.068329 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-s27g9"] Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.079957 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-s27g9"] Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.144963 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-dispersionconf\") pod \"ca99aabe-1441-457c-a6cf-37e683abeb83\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.145100 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-ring-data-devices\") pod \"ca99aabe-1441-457c-a6cf-37e683abeb83\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.145130 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-swiftconf\") pod \"ca99aabe-1441-457c-a6cf-37e683abeb83\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.145223 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-scripts\") pod \"ca99aabe-1441-457c-a6cf-37e683abeb83\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.145270 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5ggd\" (UniqueName: \"kubernetes.io/projected/ca99aabe-1441-457c-a6cf-37e683abeb83-kube-api-access-c5ggd\") pod \"ca99aabe-1441-457c-a6cf-37e683abeb83\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.145300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca99aabe-1441-457c-a6cf-37e683abeb83-etc-swift\") pod \"ca99aabe-1441-457c-a6cf-37e683abeb83\" (UID: \"ca99aabe-1441-457c-a6cf-37e683abeb83\") " Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.146123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca99aabe-1441-457c-a6cf-37e683abeb83-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ca99aabe-1441-457c-a6cf-37e683abeb83" (UID: "ca99aabe-1441-457c-a6cf-37e683abeb83"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.146623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ca99aabe-1441-457c-a6cf-37e683abeb83" (UID: "ca99aabe-1441-457c-a6cf-37e683abeb83"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.149761 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca99aabe-1441-457c-a6cf-37e683abeb83-kube-api-access-c5ggd" (OuterVolumeSpecName: "kube-api-access-c5ggd") pod "ca99aabe-1441-457c-a6cf-37e683abeb83" (UID: "ca99aabe-1441-457c-a6cf-37e683abeb83"). InnerVolumeSpecName "kube-api-access-c5ggd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.164219 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-scripts" (OuterVolumeSpecName: "scripts") pod "ca99aabe-1441-457c-a6cf-37e683abeb83" (UID: "ca99aabe-1441-457c-a6cf-37e683abeb83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.167587 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ca99aabe-1441-457c-a6cf-37e683abeb83" (UID: "ca99aabe-1441-457c-a6cf-37e683abeb83"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.169604 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ca99aabe-1441-457c-a6cf-37e683abeb83" (UID: "ca99aabe-1441-457c-a6cf-37e683abeb83"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.247005 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.247061 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.247079 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca99aabe-1441-457c-a6cf-37e683abeb83-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.247096 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca99aabe-1441-457c-a6cf-37e683abeb83-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.247112 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5ggd\" (UniqueName: \"kubernetes.io/projected/ca99aabe-1441-457c-a6cf-37e683abeb83-kube-api-access-c5ggd\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.247130 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca99aabe-1441-457c-a6cf-37e683abeb83-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.756244 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd40db37850f2ec7b11a61cdabfda99219ca2cd820b7ddc76f4d8785dcb752c" Mar 09 16:41:16 crc kubenswrapper[4831]: I0309 16:41:16.756313 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-s27g9" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.218055 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq"] Mar 09 16:41:17 crc kubenswrapper[4831]: E0309 16:41:17.218335 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca99aabe-1441-457c-a6cf-37e683abeb83" containerName="swift-ring-rebalance" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.218346 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca99aabe-1441-457c-a6cf-37e683abeb83" containerName="swift-ring-rebalance" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.218560 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca99aabe-1441-457c-a6cf-37e683abeb83" containerName="swift-ring-rebalance" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.219061 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.221631 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.221893 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.230595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq"] Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.367248 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-swiftconf\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.367296 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgpj\" (UniqueName: \"kubernetes.io/projected/4a32f905-a47a-4d45-8d53-a1218ce2d01a-kube-api-access-tfgpj\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.367333 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-scripts\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.367588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.367777 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a32f905-a47a-4d45-8d53-a1218ce2d01a-etc-swift\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.367892 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-dispersionconf\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.469158 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-swiftconf\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.469199 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfgpj\" (UniqueName: \"kubernetes.io/projected/4a32f905-a47a-4d45-8d53-a1218ce2d01a-kube-api-access-tfgpj\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.469219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-scripts\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.469268 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.469310 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/4a32f905-a47a-4d45-8d53-a1218ce2d01a-etc-swift\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.469332 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-dispersionconf\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.469893 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a32f905-a47a-4d45-8d53-a1218ce2d01a-etc-swift\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.470373 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.470511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-scripts\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.474370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-swiftconf\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.475264 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-dispersionconf\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.485309 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgpj\" (UniqueName: \"kubernetes.io/projected/4a32f905-a47a-4d45-8d53-a1218ce2d01a-kube-api-access-tfgpj\") pod \"swift-ring-rebalance-debug-5xgnq\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.572910 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.617309 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:41:17 crc kubenswrapper[4831]: E0309 16:41:17.617555 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:41:17 crc kubenswrapper[4831]: I0309 16:41:17.650241 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca99aabe-1441-457c-a6cf-37e683abeb83" path="/var/lib/kubelet/pods/ca99aabe-1441-457c-a6cf-37e683abeb83/volumes" Mar 09 16:41:18 crc kubenswrapper[4831]: I0309 16:41:18.035521 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq"] Mar 09 16:41:18 crc kubenswrapper[4831]: I0309 16:41:18.773381 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" event={"ID":"4a32f905-a47a-4d45-8d53-a1218ce2d01a","Type":"ContainerStarted","Data":"ef48b9c40ae399d11dbebd20c392ed53f4eab6ff2944d77da7d996eb8c83f005"} Mar 09 16:41:18 crc kubenswrapper[4831]: I0309 16:41:18.774638 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" event={"ID":"4a32f905-a47a-4d45-8d53-a1218ce2d01a","Type":"ContainerStarted","Data":"2d28047a5ab339d6714e58c1ca9b912fe3cfe05b8b4069b43760b58fd4b37aa2"} Mar 09 16:41:18 crc kubenswrapper[4831]: I0309 16:41:18.797668 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" podStartSLOduration=1.79764687 podStartE2EDuration="1.79764687s" podCreationTimestamp="2026-03-09 16:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:18.791384162 +0000 UTC m=+2605.925066585" watchObservedRunningTime="2026-03-09 16:41:18.79764687 +0000 UTC m=+2605.931329283" Mar 09 16:41:19 crc kubenswrapper[4831]: I0309 16:41:19.783046 4831 generic.go:334] "Generic (PLEG): container finished" podID="4a32f905-a47a-4d45-8d53-a1218ce2d01a" containerID="ef48b9c40ae399d11dbebd20c392ed53f4eab6ff2944d77da7d996eb8c83f005" exitCode=0 Mar 09 16:41:19 crc kubenswrapper[4831]: I0309 16:41:19.783136 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" event={"ID":"4a32f905-a47a-4d45-8d53-a1218ce2d01a","Type":"ContainerDied","Data":"ef48b9c40ae399d11dbebd20c392ed53f4eab6ff2944d77da7d996eb8c83f005"} Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.065923 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.092794 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq"] Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.098279 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq"] Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.224370 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a32f905-a47a-4d45-8d53-a1218ce2d01a-etc-swift\") pod \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.224538 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfgpj\" (UniqueName: \"kubernetes.io/projected/4a32f905-a47a-4d45-8d53-a1218ce2d01a-kube-api-access-tfgpj\") pod \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.224604 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-swiftconf\") pod \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.224698 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-ring-data-devices\") pod \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.224737 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-dispersionconf\") pod \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.224815 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-scripts\") pod \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\" (UID: \"4a32f905-a47a-4d45-8d53-a1218ce2d01a\") " Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.225518 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a32f905-a47a-4d45-8d53-a1218ce2d01a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4a32f905-a47a-4d45-8d53-a1218ce2d01a" (UID: "4a32f905-a47a-4d45-8d53-a1218ce2d01a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.225663 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4a32f905-a47a-4d45-8d53-a1218ce2d01a" (UID: "4a32f905-a47a-4d45-8d53-a1218ce2d01a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.225898 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a32f905-a47a-4d45-8d53-a1218ce2d01a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.225912 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.229379 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a32f905-a47a-4d45-8d53-a1218ce2d01a-kube-api-access-tfgpj" (OuterVolumeSpecName: "kube-api-access-tfgpj") pod "4a32f905-a47a-4d45-8d53-a1218ce2d01a" (UID: "4a32f905-a47a-4d45-8d53-a1218ce2d01a"). InnerVolumeSpecName "kube-api-access-tfgpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.244816 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-scripts" (OuterVolumeSpecName: "scripts") pod "4a32f905-a47a-4d45-8d53-a1218ce2d01a" (UID: "4a32f905-a47a-4d45-8d53-a1218ce2d01a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.249449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4a32f905-a47a-4d45-8d53-a1218ce2d01a" (UID: "4a32f905-a47a-4d45-8d53-a1218ce2d01a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.256997 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4a32f905-a47a-4d45-8d53-a1218ce2d01a" (UID: "4a32f905-a47a-4d45-8d53-a1218ce2d01a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.327148 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.327183 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a32f905-a47a-4d45-8d53-a1218ce2d01a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.327193 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfgpj\" (UniqueName: \"kubernetes.io/projected/4a32f905-a47a-4d45-8d53-a1218ce2d01a-kube-api-access-tfgpj\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.327202 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a32f905-a47a-4d45-8d53-a1218ce2d01a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.629273 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a32f905-a47a-4d45-8d53-a1218ce2d01a" path="/var/lib/kubelet/pods/4a32f905-a47a-4d45-8d53-a1218ce2d01a/volumes" Mar 09 16:41:21 crc kubenswrapper[4831]: I0309 16:41:21.798872 4831 scope.go:117] "RemoveContainer" containerID="ef48b9c40ae399d11dbebd20c392ed53f4eab6ff2944d77da7d996eb8c83f005" Mar 09 16:41:21 crc kubenswrapper[4831]: 
I0309 16:41:21.798900 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5xgnq" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.251797 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kxs89"] Mar 09 16:41:22 crc kubenswrapper[4831]: E0309 16:41:22.252117 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a32f905-a47a-4d45-8d53-a1218ce2d01a" containerName="swift-ring-rebalance" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.252129 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a32f905-a47a-4d45-8d53-a1218ce2d01a" containerName="swift-ring-rebalance" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.252260 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a32f905-a47a-4d45-8d53-a1218ce2d01a" containerName="swift-ring-rebalance" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.252800 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.254919 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.255939 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.264696 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kxs89"] Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.340665 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shkt\" (UniqueName: \"kubernetes.io/projected/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-kube-api-access-8shkt\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.340981 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-scripts\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.341067 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-swiftconf\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.341153 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-etc-swift\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.341207 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-dispersionconf\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.341326 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-ring-data-devices\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.442847 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-scripts\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.442905 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-swiftconf\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: 
I0309 16:41:22.442936 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-etc-swift\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.442966 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-dispersionconf\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.443010 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-ring-data-devices\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.443057 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shkt\" (UniqueName: \"kubernetes.io/projected/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-kube-api-access-8shkt\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.443713 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-etc-swift\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc 
kubenswrapper[4831]: I0309 16:41:22.443936 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-scripts\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.443966 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-ring-data-devices\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.446623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-dispersionconf\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.448630 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-swiftconf\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: I0309 16:41:22.468622 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shkt\" (UniqueName: \"kubernetes.io/projected/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-kube-api-access-8shkt\") pod \"swift-ring-rebalance-debug-kxs89\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:22 crc kubenswrapper[4831]: 
I0309 16:41:22.577913 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:23 crc kubenswrapper[4831]: I0309 16:41:23.046454 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kxs89"] Mar 09 16:41:23 crc kubenswrapper[4831]: W0309 16:41:23.054320 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e9bdb0_38b5_4c5d_b8cc_0093e0375ef9.slice/crio-f74f09b96bae107c398efd7024063770f495b8794005c1216296fb0edeca26b6 WatchSource:0}: Error finding container f74f09b96bae107c398efd7024063770f495b8794005c1216296fb0edeca26b6: Status 404 returned error can't find the container with id f74f09b96bae107c398efd7024063770f495b8794005c1216296fb0edeca26b6 Mar 09 16:41:23 crc kubenswrapper[4831]: I0309 16:41:23.826594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" event={"ID":"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9","Type":"ContainerStarted","Data":"4ad76ad6a474a057227ec94b4cb89e318d6933c5744df0bdebb5872f3c2d56fa"} Mar 09 16:41:23 crc kubenswrapper[4831]: I0309 16:41:23.827046 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" event={"ID":"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9","Type":"ContainerStarted","Data":"f74f09b96bae107c398efd7024063770f495b8794005c1216296fb0edeca26b6"} Mar 09 16:41:23 crc kubenswrapper[4831]: I0309 16:41:23.875336 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" podStartSLOduration=1.875317983 podStartE2EDuration="1.875317983s" podCreationTimestamp="2026-03-09 16:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:23.867562311 +0000 
UTC m=+2611.001244734" watchObservedRunningTime="2026-03-09 16:41:23.875317983 +0000 UTC m=+2611.009000406" Mar 09 16:41:24 crc kubenswrapper[4831]: I0309 16:41:24.839167 4831 generic.go:334] "Generic (PLEG): container finished" podID="f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" containerID="4ad76ad6a474a057227ec94b4cb89e318d6933c5744df0bdebb5872f3c2d56fa" exitCode=0 Mar 09 16:41:24 crc kubenswrapper[4831]: I0309 16:41:24.839243 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" event={"ID":"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9","Type":"ContainerDied","Data":"4ad76ad6a474a057227ec94b4cb89e318d6933c5744df0bdebb5872f3c2d56fa"} Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.108557 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.156893 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kxs89"] Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.164760 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kxs89"] Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.194738 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-swiftconf\") pod \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.194889 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-dispersionconf\") pod \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.194938 
4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-scripts\") pod \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.194967 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-ring-data-devices\") pod \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.195613 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8shkt\" (UniqueName: \"kubernetes.io/projected/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-kube-api-access-8shkt\") pod \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.195672 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-etc-swift\") pod \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\" (UID: \"f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9\") " Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.196086 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" (UID: "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.196227 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.196917 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" (UID: "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.201600 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-kube-api-access-8shkt" (OuterVolumeSpecName: "kube-api-access-8shkt") pod "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" (UID: "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9"). InnerVolumeSpecName "kube-api-access-8shkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.219553 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-scripts" (OuterVolumeSpecName: "scripts") pod "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" (UID: "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.222373 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" (UID: "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9"). 
InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.223733 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" (UID: "f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.297901 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.297943 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.297958 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8shkt\" (UniqueName: \"kubernetes.io/projected/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-kube-api-access-8shkt\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.297972 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.297983 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.855919 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f74f09b96bae107c398efd7024063770f495b8794005c1216296fb0edeca26b6" Mar 09 16:41:26 crc kubenswrapper[4831]: I0309 16:41:26.855975 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kxs89" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.294789 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb"] Mar 09 16:41:27 crc kubenswrapper[4831]: E0309 16:41:27.295159 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" containerName="swift-ring-rebalance" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.295176 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" containerName="swift-ring-rebalance" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.295344 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" containerName="swift-ring-rebalance" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.295965 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.298942 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.299114 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.302381 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb"] Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.411253 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-swiftconf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.411653 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zrf\" (UniqueName: \"kubernetes.io/projected/2729c909-d4e4-4482-8a44-cf321ba0cce5-kube-api-access-j2zrf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.411769 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.411801 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2729c909-d4e4-4482-8a44-cf321ba0cce5-etc-swift\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.411827 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-scripts\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.411846 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-dispersionconf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.513628 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zrf\" (UniqueName: \"kubernetes.io/projected/2729c909-d4e4-4482-8a44-cf321ba0cce5-kube-api-access-j2zrf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.513759 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 
crc kubenswrapper[4831]: I0309 16:41:27.513791 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2729c909-d4e4-4482-8a44-cf321ba0cce5-etc-swift\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.513821 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-dispersionconf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.513843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-scripts\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.513895 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-swiftconf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.515455 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-scripts\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 
16:41:27.515648 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2729c909-d4e4-4482-8a44-cf321ba0cce5-etc-swift\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.516530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.517887 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-swiftconf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.520120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-dispersionconf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.532955 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zrf\" (UniqueName: \"kubernetes.io/projected/2729c909-d4e4-4482-8a44-cf321ba0cce5-kube-api-access-j2zrf\") pod \"swift-ring-rebalance-debug-zkzjb\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.628797 4831 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9" path="/var/lib/kubelet/pods/f9e9bdb0-38b5-4c5d-b8cc-0093e0375ef9/volumes" Mar 09 16:41:27 crc kubenswrapper[4831]: I0309 16:41:27.653161 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:28 crc kubenswrapper[4831]: I0309 16:41:28.055129 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb"] Mar 09 16:41:28 crc kubenswrapper[4831]: W0309 16:41:28.063784 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2729c909_d4e4_4482_8a44_cf321ba0cce5.slice/crio-8ef25d1e5bf5afc507fe7552c6e3484d427a6f98a0eeb6a691c370dfa3b10d32 WatchSource:0}: Error finding container 8ef25d1e5bf5afc507fe7552c6e3484d427a6f98a0eeb6a691c370dfa3b10d32: Status 404 returned error can't find the container with id 8ef25d1e5bf5afc507fe7552c6e3484d427a6f98a0eeb6a691c370dfa3b10d32 Mar 09 16:41:28 crc kubenswrapper[4831]: I0309 16:41:28.878153 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" event={"ID":"2729c909-d4e4-4482-8a44-cf321ba0cce5","Type":"ContainerStarted","Data":"7b94e28849fd73ebe65ccec64003170aeab69e09a6a827610b3abe4c3438c557"} Mar 09 16:41:28 crc kubenswrapper[4831]: I0309 16:41:28.878541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" event={"ID":"2729c909-d4e4-4482-8a44-cf321ba0cce5","Type":"ContainerStarted","Data":"8ef25d1e5bf5afc507fe7552c6e3484d427a6f98a0eeb6a691c370dfa3b10d32"} Mar 09 16:41:28 crc kubenswrapper[4831]: I0309 16:41:28.896182 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" podStartSLOduration=1.8961672250000001 
podStartE2EDuration="1.896167225s" podCreationTimestamp="2026-03-09 16:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:28.891711128 +0000 UTC m=+2616.025393551" watchObservedRunningTime="2026-03-09 16:41:28.896167225 +0000 UTC m=+2616.029849648" Mar 09 16:41:29 crc kubenswrapper[4831]: I0309 16:41:29.890887 4831 generic.go:334] "Generic (PLEG): container finished" podID="2729c909-d4e4-4482-8a44-cf321ba0cce5" containerID="7b94e28849fd73ebe65ccec64003170aeab69e09a6a827610b3abe4c3438c557" exitCode=0 Mar 09 16:41:29 crc kubenswrapper[4831]: I0309 16:41:29.890952 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" event={"ID":"2729c909-d4e4-4482-8a44-cf321ba0cce5","Type":"ContainerDied","Data":"7b94e28849fd73ebe65ccec64003170aeab69e09a6a827610b3abe4c3438c557"} Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.213090 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb" Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.257644 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb"] Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.264845 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb"] Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.373264 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-scripts\") pod \"2729c909-d4e4-4482-8a44-cf321ba0cce5\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.373343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2zrf\" (UniqueName: \"kubernetes.io/projected/2729c909-d4e4-4482-8a44-cf321ba0cce5-kube-api-access-j2zrf\") pod \"2729c909-d4e4-4482-8a44-cf321ba0cce5\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.373476 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2729c909-d4e4-4482-8a44-cf321ba0cce5-etc-swift\") pod \"2729c909-d4e4-4482-8a44-cf321ba0cce5\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.373541 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-ring-data-devices\") pod \"2729c909-d4e4-4482-8a44-cf321ba0cce5\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.373613 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-swiftconf\") pod \"2729c909-d4e4-4482-8a44-cf321ba0cce5\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.373648 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-dispersionconf\") pod \"2729c909-d4e4-4482-8a44-cf321ba0cce5\" (UID: \"2729c909-d4e4-4482-8a44-cf321ba0cce5\") " Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.374255 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2729c909-d4e4-4482-8a44-cf321ba0cce5" (UID: "2729c909-d4e4-4482-8a44-cf321ba0cce5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.374343 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2729c909-d4e4-4482-8a44-cf321ba0cce5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2729c909-d4e4-4482-8a44-cf321ba0cce5" (UID: "2729c909-d4e4-4482-8a44-cf321ba0cce5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.385514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2729c909-d4e4-4482-8a44-cf321ba0cce5-kube-api-access-j2zrf" (OuterVolumeSpecName: "kube-api-access-j2zrf") pod "2729c909-d4e4-4482-8a44-cf321ba0cce5" (UID: "2729c909-d4e4-4482-8a44-cf321ba0cce5"). InnerVolumeSpecName "kube-api-access-j2zrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.402549 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2729c909-d4e4-4482-8a44-cf321ba0cce5" (UID: "2729c909-d4e4-4482-8a44-cf321ba0cce5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.406879 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2729c909-d4e4-4482-8a44-cf321ba0cce5" (UID: "2729c909-d4e4-4482-8a44-cf321ba0cce5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.409440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-scripts" (OuterVolumeSpecName: "scripts") pod "2729c909-d4e4-4482-8a44-cf321ba0cce5" (UID: "2729c909-d4e4-4482-8a44-cf321ba0cce5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.475585 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.475620 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.475632 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2zrf\" (UniqueName: \"kubernetes.io/projected/2729c909-d4e4-4482-8a44-cf321ba0cce5-kube-api-access-j2zrf\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.475641 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2729c909-d4e4-4482-8a44-cf321ba0cce5-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.475649 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2729c909-d4e4-4482-8a44-cf321ba0cce5-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.475657 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2729c909-d4e4-4482-8a44-cf321ba0cce5-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.625839 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2729c909-d4e4-4482-8a44-cf321ba0cce5" path="/var/lib/kubelet/pods/2729c909-d4e4-4482-8a44-cf321ba0cce5/volumes"
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.912001 4831 scope.go:117] "RemoveContainer" containerID="7b94e28849fd73ebe65ccec64003170aeab69e09a6a827610b3abe4c3438c557"
Mar 09 16:41:31 crc kubenswrapper[4831]: I0309 16:41:31.912140 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkzjb"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.414745 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"]
Mar 09 16:41:32 crc kubenswrapper[4831]: E0309 16:41:32.415303 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2729c909-d4e4-4482-8a44-cf321ba0cce5" containerName="swift-ring-rebalance"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.415327 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2729c909-d4e4-4482-8a44-cf321ba0cce5" containerName="swift-ring-rebalance"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.415682 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2729c909-d4e4-4482-8a44-cf321ba0cce5" containerName="swift-ring-rebalance"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.416390 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.419171 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.419665 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.434582 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"]
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.490598 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-dispersionconf\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.490653 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8mm\" (UniqueName: \"kubernetes.io/projected/ff9458e9-02a1-419b-82e8-3ef045b78971-kube-api-access-6w8mm\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.490674 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-ring-data-devices\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.490695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff9458e9-02a1-419b-82e8-3ef045b78971-etc-swift\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.490744 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-scripts\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.490762 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-swiftconf\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.592837 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-scripts\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.592917 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-swiftconf\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.593032 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-dispersionconf\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.593090 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8mm\" (UniqueName: \"kubernetes.io/projected/ff9458e9-02a1-419b-82e8-3ef045b78971-kube-api-access-6w8mm\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.593127 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-ring-data-devices\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.593164 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff9458e9-02a1-419b-82e8-3ef045b78971-etc-swift\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.593804 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-scripts\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.593870 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff9458e9-02a1-419b-82e8-3ef045b78971-etc-swift\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.594333 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-ring-data-devices\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.597259 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-dispersionconf\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.597784 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-swiftconf\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.617526 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5"
Mar 09 16:41:32 crc kubenswrapper[4831]: E0309 16:41:32.617922 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.619929 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8mm\" (UniqueName: \"kubernetes.io/projected/ff9458e9-02a1-419b-82e8-3ef045b78971-kube-api-access-6w8mm\") pod \"swift-ring-rebalance-debug-7r6gf\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:32 crc kubenswrapper[4831]: I0309 16:41:32.792740 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:33 crc kubenswrapper[4831]: I0309 16:41:33.025940 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"]
Mar 09 16:41:33 crc kubenswrapper[4831]: I0309 16:41:33.937719 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf" event={"ID":"ff9458e9-02a1-419b-82e8-3ef045b78971","Type":"ContainerStarted","Data":"f5c4e66f1d1683ef4594b2c86e38e7cae32fb46a91145b700fb1b3ddfefe6301"}
Mar 09 16:41:33 crc kubenswrapper[4831]: I0309 16:41:33.938113 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf" event={"ID":"ff9458e9-02a1-419b-82e8-3ef045b78971","Type":"ContainerStarted","Data":"ac1ee1c565b6c90b8db7d1ba08afcc2dd3faa4ef065262e1c941ab4bf210100e"}
Mar 09 16:41:33 crc kubenswrapper[4831]: I0309 16:41:33.961314 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf" podStartSLOduration=1.9612979400000001 podStartE2EDuration="1.96129794s" podCreationTimestamp="2026-03-09 16:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:33.958098959 +0000 UTC m=+2621.091781382" watchObservedRunningTime="2026-03-09 16:41:33.96129794 +0000 UTC m=+2621.094980363"
Mar 09 16:41:34 crc kubenswrapper[4831]: I0309 16:41:34.951649 4831 generic.go:334] "Generic (PLEG): container finished" podID="ff9458e9-02a1-419b-82e8-3ef045b78971" containerID="f5c4e66f1d1683ef4594b2c86e38e7cae32fb46a91145b700fb1b3ddfefe6301" exitCode=0
Mar 09 16:41:34 crc kubenswrapper[4831]: I0309 16:41:34.951781 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf" event={"ID":"ff9458e9-02a1-419b-82e8-3ef045b78971","Type":"ContainerDied","Data":"f5c4e66f1d1683ef4594b2c86e38e7cae32fb46a91145b700fb1b3ddfefe6301"}
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.230934 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.260787 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"]
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.266875 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"]
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.352269 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-ring-data-devices\") pod \"ff9458e9-02a1-419b-82e8-3ef045b78971\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") "
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.352466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-dispersionconf\") pod \"ff9458e9-02a1-419b-82e8-3ef045b78971\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") "
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.352536 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w8mm\" (UniqueName: \"kubernetes.io/projected/ff9458e9-02a1-419b-82e8-3ef045b78971-kube-api-access-6w8mm\") pod \"ff9458e9-02a1-419b-82e8-3ef045b78971\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") "
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.352721 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ff9458e9-02a1-419b-82e8-3ef045b78971" (UID: "ff9458e9-02a1-419b-82e8-3ef045b78971"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.353249 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff9458e9-02a1-419b-82e8-3ef045b78971-etc-swift\") pod \"ff9458e9-02a1-419b-82e8-3ef045b78971\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") "
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.353295 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-swiftconf\") pod \"ff9458e9-02a1-419b-82e8-3ef045b78971\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") "
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.353380 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-scripts\") pod \"ff9458e9-02a1-419b-82e8-3ef045b78971\" (UID: \"ff9458e9-02a1-419b-82e8-3ef045b78971\") "
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.353785 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.353979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9458e9-02a1-419b-82e8-3ef045b78971-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ff9458e9-02a1-419b-82e8-3ef045b78971" (UID: "ff9458e9-02a1-419b-82e8-3ef045b78971"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.362996 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9458e9-02a1-419b-82e8-3ef045b78971-kube-api-access-6w8mm" (OuterVolumeSpecName: "kube-api-access-6w8mm") pod "ff9458e9-02a1-419b-82e8-3ef045b78971" (UID: "ff9458e9-02a1-419b-82e8-3ef045b78971"). InnerVolumeSpecName "kube-api-access-6w8mm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.373737 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ff9458e9-02a1-419b-82e8-3ef045b78971" (UID: "ff9458e9-02a1-419b-82e8-3ef045b78971"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.375438 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-scripts" (OuterVolumeSpecName: "scripts") pod "ff9458e9-02a1-419b-82e8-3ef045b78971" (UID: "ff9458e9-02a1-419b-82e8-3ef045b78971"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.379561 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ff9458e9-02a1-419b-82e8-3ef045b78971" (UID: "ff9458e9-02a1-419b-82e8-3ef045b78971"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.455552 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.455582 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w8mm\" (UniqueName: \"kubernetes.io/projected/ff9458e9-02a1-419b-82e8-3ef045b78971-kube-api-access-6w8mm\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.455593 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff9458e9-02a1-419b-82e8-3ef045b78971-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.455601 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff9458e9-02a1-419b-82e8-3ef045b78971-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.455609 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff9458e9-02a1-419b-82e8-3ef045b78971-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.973168 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac1ee1c565b6c90b8db7d1ba08afcc2dd3faa4ef065262e1c941ab4bf210100e"
Mar 09 16:41:36 crc kubenswrapper[4831]: I0309 16:41:36.973235 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7r6gf"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.415980 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"]
Mar 09 16:41:37 crc kubenswrapper[4831]: E0309 16:41:37.416628 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9458e9-02a1-419b-82e8-3ef045b78971" containerName="swift-ring-rebalance"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.416658 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9458e9-02a1-419b-82e8-3ef045b78971" containerName="swift-ring-rebalance"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.416941 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9458e9-02a1-419b-82e8-3ef045b78971" containerName="swift-ring-rebalance"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.418142 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.423786 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.425440 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.431849 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"]
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.573873 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ba335-538c-464f-8aa1-1d9eef272259-etc-swift\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.573937 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-dispersionconf\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.574107 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-scripts\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.574306 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v28p2\" (UniqueName: \"kubernetes.io/projected/571ba335-538c-464f-8aa1-1d9eef272259-kube-api-access-v28p2\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.574553 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-ring-data-devices\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.574617 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-swiftconf\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.629490 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9458e9-02a1-419b-82e8-3ef045b78971" path="/var/lib/kubelet/pods/ff9458e9-02a1-419b-82e8-3ef045b78971/volumes"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.676442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-scripts\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.676576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v28p2\" (UniqueName: \"kubernetes.io/projected/571ba335-538c-464f-8aa1-1d9eef272259-kube-api-access-v28p2\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.676666 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-ring-data-devices\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.676715 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-swiftconf\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.676790 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ba335-538c-464f-8aa1-1d9eef272259-etc-swift\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.676824 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-dispersionconf\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.677361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ba335-538c-464f-8aa1-1d9eef272259-etc-swift\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.677517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-ring-data-devices\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.677596 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-scripts\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.681129 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-swiftconf\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.681588 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-dispersionconf\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.697556 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v28p2\" (UniqueName: \"kubernetes.io/projected/571ba335-538c-464f-8aa1-1d9eef272259-kube-api-access-v28p2\") pod \"swift-ring-rebalance-debug-77jlz\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.738461 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:37 crc kubenswrapper[4831]: I0309 16:41:37.970949 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"]
Mar 09 16:41:37 crc kubenswrapper[4831]: W0309 16:41:37.973866 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod571ba335_538c_464f_8aa1_1d9eef272259.slice/crio-a3e47933bc01a1306bdd624bac73daa373c855ebb2a66caa57d973246987b499 WatchSource:0}: Error finding container a3e47933bc01a1306bdd624bac73daa373c855ebb2a66caa57d973246987b499: Status 404 returned error can't find the container with id a3e47933bc01a1306bdd624bac73daa373c855ebb2a66caa57d973246987b499
Mar 09 16:41:38 crc kubenswrapper[4831]: I0309 16:41:38.999460 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz" event={"ID":"571ba335-538c-464f-8aa1-1d9eef272259","Type":"ContainerStarted","Data":"fbf8896e0c02ad2aa758b4b6de8520184cc64c9e731f12cded317b959c8b94a1"}
Mar 09 16:41:38 crc kubenswrapper[4831]: I0309 16:41:38.999735 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz" event={"ID":"571ba335-538c-464f-8aa1-1d9eef272259","Type":"ContainerStarted","Data":"a3e47933bc01a1306bdd624bac73daa373c855ebb2a66caa57d973246987b499"}
Mar 09 16:41:39 crc kubenswrapper[4831]: I0309 16:41:39.020080 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz" podStartSLOduration=2.020059345 podStartE2EDuration="2.020059345s" podCreationTimestamp="2026-03-09 16:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:39.014588839 +0000 UTC m=+2626.148271262" watchObservedRunningTime="2026-03-09 16:41:39.020059345 +0000 UTC m=+2626.153741778"
Mar 09 16:41:40 crc kubenswrapper[4831]: I0309 16:41:40.012731 4831 generic.go:334] "Generic (PLEG): container finished" podID="571ba335-538c-464f-8aa1-1d9eef272259" containerID="fbf8896e0c02ad2aa758b4b6de8520184cc64c9e731f12cded317b959c8b94a1" exitCode=0
Mar 09 16:41:40 crc kubenswrapper[4831]: I0309 16:41:40.012784 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz" event={"ID":"571ba335-538c-464f-8aa1-1d9eef272259","Type":"ContainerDied","Data":"fbf8896e0c02ad2aa758b4b6de8520184cc64c9e731f12cded317b959c8b94a1"}
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.332306 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.359853 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"]
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.366939 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77jlz"]
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.436807 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-scripts\") pod \"571ba335-538c-464f-8aa1-1d9eef272259\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") "
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.436895 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-dispersionconf\") pod \"571ba335-538c-464f-8aa1-1d9eef272259\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") "
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.436957 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-ring-data-devices\") pod \"571ba335-538c-464f-8aa1-1d9eef272259\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") "
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.436989 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-swiftconf\") pod \"571ba335-538c-464f-8aa1-1d9eef272259\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") "
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.437082 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v28p2\" (UniqueName: \"kubernetes.io/projected/571ba335-538c-464f-8aa1-1d9eef272259-kube-api-access-v28p2\") pod \"571ba335-538c-464f-8aa1-1d9eef272259\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") "
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.437128 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ba335-538c-464f-8aa1-1d9eef272259-etc-swift\") pod \"571ba335-538c-464f-8aa1-1d9eef272259\" (UID: \"571ba335-538c-464f-8aa1-1d9eef272259\") "
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.437927 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "571ba335-538c-464f-8aa1-1d9eef272259" (UID: "571ba335-538c-464f-8aa1-1d9eef272259"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.438194 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571ba335-538c-464f-8aa1-1d9eef272259-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "571ba335-538c-464f-8aa1-1d9eef272259" (UID: "571ba335-538c-464f-8aa1-1d9eef272259"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.443942 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571ba335-538c-464f-8aa1-1d9eef272259-kube-api-access-v28p2" (OuterVolumeSpecName: "kube-api-access-v28p2") pod "571ba335-538c-464f-8aa1-1d9eef272259" (UID: "571ba335-538c-464f-8aa1-1d9eef272259"). InnerVolumeSpecName "kube-api-access-v28p2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.459272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "571ba335-538c-464f-8aa1-1d9eef272259" (UID: "571ba335-538c-464f-8aa1-1d9eef272259"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.465669 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "571ba335-538c-464f-8aa1-1d9eef272259" (UID: "571ba335-538c-464f-8aa1-1d9eef272259"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.469729 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-scripts" (OuterVolumeSpecName: "scripts") pod "571ba335-538c-464f-8aa1-1d9eef272259" (UID: "571ba335-538c-464f-8aa1-1d9eef272259"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.539844 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.539926 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.539950 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/571ba335-538c-464f-8aa1-1d9eef272259-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.539973 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/571ba335-538c-464f-8aa1-1d9eef272259-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.540018 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v28p2\" (UniqueName: \"kubernetes.io/projected/571ba335-538c-464f-8aa1-1d9eef272259-kube-api-access-v28p2\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.540045 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/571ba335-538c-464f-8aa1-1d9eef272259-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 16:41:41 crc kubenswrapper[4831]: I0309 16:41:41.627730 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571ba335-538c-464f-8aa1-1d9eef272259" path="/var/lib/kubelet/pods/571ba335-538c-464f-8aa1-1d9eef272259/volumes"
Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.034869 4831 scope.go:117] "RemoveContainer"
containerID="fbf8896e0c02ad2aa758b4b6de8520184cc64c9e731f12cded317b959c8b94a1" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.035113 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77jlz" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.551894 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-28sgp"] Mar 09 16:41:42 crc kubenswrapper[4831]: E0309 16:41:42.552266 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571ba335-538c-464f-8aa1-1d9eef272259" containerName="swift-ring-rebalance" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.552283 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="571ba335-538c-464f-8aa1-1d9eef272259" containerName="swift-ring-rebalance" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.552538 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="571ba335-538c-464f-8aa1-1d9eef272259" containerName="swift-ring-rebalance" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.553144 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.555693 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.557898 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.567934 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-28sgp"] Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.656608 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-dispersionconf\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.656932 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrtx\" (UniqueName: \"kubernetes.io/projected/900e517e-cd20-45d3-8057-133f642d6ab9-kube-api-access-vbrtx\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.657362 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-swiftconf\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.657504 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-ring-data-devices\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.657583 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-scripts\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.657674 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/900e517e-cd20-45d3-8057-133f642d6ab9-etc-swift\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.759317 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-swiftconf\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.759543 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-ring-data-devices\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc 
kubenswrapper[4831]: I0309 16:41:42.759593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-scripts\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.759678 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/900e517e-cd20-45d3-8057-133f642d6ab9-etc-swift\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.759747 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-dispersionconf\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.759804 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrtx\" (UniqueName: \"kubernetes.io/projected/900e517e-cd20-45d3-8057-133f642d6ab9-kube-api-access-vbrtx\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.760553 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-ring-data-devices\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 
16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.760871 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/900e517e-cd20-45d3-8057-133f642d6ab9-etc-swift\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.761238 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-scripts\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.767090 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-swiftconf\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.768049 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-dispersionconf\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 16:41:42.782511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrtx\" (UniqueName: \"kubernetes.io/projected/900e517e-cd20-45d3-8057-133f642d6ab9-kube-api-access-vbrtx\") pod \"swift-ring-rebalance-debug-28sgp\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:42 crc kubenswrapper[4831]: I0309 
16:41:42.892086 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:43 crc kubenswrapper[4831]: I0309 16:41:43.149513 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-28sgp"] Mar 09 16:41:44 crc kubenswrapper[4831]: I0309 16:41:44.062119 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" event={"ID":"900e517e-cd20-45d3-8057-133f642d6ab9","Type":"ContainerStarted","Data":"e03d0ff347cf538554374cfd546c072801b63ed6f248d36a7b28edbca3e21bf7"} Mar 09 16:41:44 crc kubenswrapper[4831]: I0309 16:41:44.062372 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" event={"ID":"900e517e-cd20-45d3-8057-133f642d6ab9","Type":"ContainerStarted","Data":"5508a3fff0250c995d73c9516961eb7ff3cbd0221fd55d0015ac92a5a614162d"} Mar 09 16:41:44 crc kubenswrapper[4831]: I0309 16:41:44.084601 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" podStartSLOduration=2.084586323 podStartE2EDuration="2.084586323s" podCreationTimestamp="2026-03-09 16:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:44.082818243 +0000 UTC m=+2631.216500676" watchObservedRunningTime="2026-03-09 16:41:44.084586323 +0000 UTC m=+2631.218268746" Mar 09 16:41:45 crc kubenswrapper[4831]: I0309 16:41:45.074334 4831 generic.go:334] "Generic (PLEG): container finished" podID="900e517e-cd20-45d3-8057-133f642d6ab9" containerID="e03d0ff347cf538554374cfd546c072801b63ed6f248d36a7b28edbca3e21bf7" exitCode=0 Mar 09 16:41:45 crc kubenswrapper[4831]: I0309 16:41:45.074456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" 
event={"ID":"900e517e-cd20-45d3-8057-133f642d6ab9","Type":"ContainerDied","Data":"e03d0ff347cf538554374cfd546c072801b63ed6f248d36a7b28edbca3e21bf7"} Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.444091 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.471319 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-28sgp"] Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.477154 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-28sgp"] Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.519151 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbrtx\" (UniqueName: \"kubernetes.io/projected/900e517e-cd20-45d3-8057-133f642d6ab9-kube-api-access-vbrtx\") pod \"900e517e-cd20-45d3-8057-133f642d6ab9\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.519201 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-swiftconf\") pod \"900e517e-cd20-45d3-8057-133f642d6ab9\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.519251 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/900e517e-cd20-45d3-8057-133f642d6ab9-etc-swift\") pod \"900e517e-cd20-45d3-8057-133f642d6ab9\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.519299 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-dispersionconf\") pod \"900e517e-cd20-45d3-8057-133f642d6ab9\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.519358 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-ring-data-devices\") pod \"900e517e-cd20-45d3-8057-133f642d6ab9\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.519504 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-scripts\") pod \"900e517e-cd20-45d3-8057-133f642d6ab9\" (UID: \"900e517e-cd20-45d3-8057-133f642d6ab9\") " Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.520711 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/900e517e-cd20-45d3-8057-133f642d6ab9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "900e517e-cd20-45d3-8057-133f642d6ab9" (UID: "900e517e-cd20-45d3-8057-133f642d6ab9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.521370 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "900e517e-cd20-45d3-8057-133f642d6ab9" (UID: "900e517e-cd20-45d3-8057-133f642d6ab9"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.525446 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900e517e-cd20-45d3-8057-133f642d6ab9-kube-api-access-vbrtx" (OuterVolumeSpecName: "kube-api-access-vbrtx") pod "900e517e-cd20-45d3-8057-133f642d6ab9" (UID: "900e517e-cd20-45d3-8057-133f642d6ab9"). InnerVolumeSpecName "kube-api-access-vbrtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.545550 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "900e517e-cd20-45d3-8057-133f642d6ab9" (UID: "900e517e-cd20-45d3-8057-133f642d6ab9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.545988 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-scripts" (OuterVolumeSpecName: "scripts") pod "900e517e-cd20-45d3-8057-133f642d6ab9" (UID: "900e517e-cd20-45d3-8057-133f642d6ab9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.546275 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "900e517e-cd20-45d3-8057-133f642d6ab9" (UID: "900e517e-cd20-45d3-8057-133f642d6ab9"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.620909 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbrtx\" (UniqueName: \"kubernetes.io/projected/900e517e-cd20-45d3-8057-133f642d6ab9-kube-api-access-vbrtx\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.620970 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.620986 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/900e517e-cd20-45d3-8057-133f642d6ab9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.621000 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/900e517e-cd20-45d3-8057-133f642d6ab9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.621013 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:46 crc kubenswrapper[4831]: I0309 16:41:46.621028 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/900e517e-cd20-45d3-8057-133f642d6ab9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.105901 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5508a3fff0250c995d73c9516961eb7ff3cbd0221fd55d0015ac92a5a614162d" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.106025 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-28sgp" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.618193 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.629866 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900e517e-cd20-45d3-8057-133f642d6ab9" path="/var/lib/kubelet/pods/900e517e-cd20-45d3-8057-133f642d6ab9/volumes" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.655233 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5"] Mar 09 16:41:47 crc kubenswrapper[4831]: E0309 16:41:47.655681 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900e517e-cd20-45d3-8057-133f642d6ab9" containerName="swift-ring-rebalance" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.655699 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="900e517e-cd20-45d3-8057-133f642d6ab9" containerName="swift-ring-rebalance" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.655851 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="900e517e-cd20-45d3-8057-133f642d6ab9" containerName="swift-ring-rebalance" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.656483 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.660138 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.661286 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.669841 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5"] Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.740117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-dispersionconf\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.740664 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-swiftconf\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.740723 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa0482d8-3b39-4442-9490-76a7e2c6ab13-etc-swift\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.740837 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-scripts\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.740951 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-ring-data-devices\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.740997 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgn8m\" (UniqueName: \"kubernetes.io/projected/aa0482d8-3b39-4442-9490-76a7e2c6ab13-kube-api-access-wgn8m\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.841880 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-swiftconf\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.841947 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa0482d8-3b39-4442-9490-76a7e2c6ab13-etc-swift\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.842007 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-scripts\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.842047 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-ring-data-devices\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.842076 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgn8m\" (UniqueName: \"kubernetes.io/projected/aa0482d8-3b39-4442-9490-76a7e2c6ab13-kube-api-access-wgn8m\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.842124 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-dispersionconf\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.842907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa0482d8-3b39-4442-9490-76a7e2c6ab13-etc-swift\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 
16:41:47.843127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-ring-data-devices\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.843650 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-scripts\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.849101 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-swiftconf\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.850116 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-dispersionconf\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:47 crc kubenswrapper[4831]: I0309 16:41:47.863884 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgn8m\" (UniqueName: \"kubernetes.io/projected/aa0482d8-3b39-4442-9490-76a7e2c6ab13-kube-api-access-wgn8m\") pod \"swift-ring-rebalance-debug-jlxl5\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:48 crc kubenswrapper[4831]: I0309 16:41:48.042685 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:48 crc kubenswrapper[4831]: I0309 16:41:48.138299 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"ae5cb9d1fa3196bef37c6cd6d3a50d7b49906c2370397743a70f5b00a61f5b04"} Mar 09 16:41:48 crc kubenswrapper[4831]: I0309 16:41:48.470430 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5"] Mar 09 16:41:48 crc kubenswrapper[4831]: W0309 16:41:48.480575 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0482d8_3b39_4442_9490_76a7e2c6ab13.slice/crio-f5b32904d98b25a057b011fbd211068907367b77f33711c72df9a26b60a0cf6a WatchSource:0}: Error finding container f5b32904d98b25a057b011fbd211068907367b77f33711c72df9a26b60a0cf6a: Status 404 returned error can't find the container with id f5b32904d98b25a057b011fbd211068907367b77f33711c72df9a26b60a0cf6a Mar 09 16:41:49 crc kubenswrapper[4831]: I0309 16:41:49.149259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" event={"ID":"aa0482d8-3b39-4442-9490-76a7e2c6ab13","Type":"ContainerStarted","Data":"aef1b3c8a5e667b7c0e9cb8c5a8374e040de6a6ee9e92326c3e6158085eaee1b"} Mar 09 16:41:49 crc kubenswrapper[4831]: I0309 16:41:49.149546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" event={"ID":"aa0482d8-3b39-4442-9490-76a7e2c6ab13","Type":"ContainerStarted","Data":"f5b32904d98b25a057b011fbd211068907367b77f33711c72df9a26b60a0cf6a"} Mar 09 16:41:49 crc kubenswrapper[4831]: I0309 16:41:49.164750 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" podStartSLOduration=2.164724876 podStartE2EDuration="2.164724876s" podCreationTimestamp="2026-03-09 16:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 16:41:49.161438882 +0000 UTC m=+2636.295121345" watchObservedRunningTime="2026-03-09 16:41:49.164724876 +0000 UTC m=+2636.298407339" Mar 09 16:41:50 crc kubenswrapper[4831]: I0309 16:41:50.167889 4831 generic.go:334] "Generic (PLEG): container finished" podID="aa0482d8-3b39-4442-9490-76a7e2c6ab13" containerID="aef1b3c8a5e667b7c0e9cb8c5a8374e040de6a6ee9e92326c3e6158085eaee1b" exitCode=0 Mar 09 16:41:50 crc kubenswrapper[4831]: I0309 16:41:50.167961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" event={"ID":"aa0482d8-3b39-4442-9490-76a7e2c6ab13","Type":"ContainerDied","Data":"aef1b3c8a5e667b7c0e9cb8c5a8374e040de6a6ee9e92326c3e6158085eaee1b"} Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.460312 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.495994 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5"] Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.500252 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5"] Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.599052 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-dispersionconf\") pod \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.599105 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa0482d8-3b39-4442-9490-76a7e2c6ab13-etc-swift\") pod \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.599167 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgn8m\" (UniqueName: \"kubernetes.io/projected/aa0482d8-3b39-4442-9490-76a7e2c6ab13-kube-api-access-wgn8m\") pod \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.599213 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-swiftconf\") pod \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.599324 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-ring-data-devices\") pod \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.599409 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-scripts\") pod \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\" (UID: \"aa0482d8-3b39-4442-9490-76a7e2c6ab13\") " Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.600570 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "aa0482d8-3b39-4442-9490-76a7e2c6ab13" (UID: "aa0482d8-3b39-4442-9490-76a7e2c6ab13"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.601038 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.602879 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa0482d8-3b39-4442-9490-76a7e2c6ab13-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "aa0482d8-3b39-4442-9490-76a7e2c6ab13" (UID: "aa0482d8-3b39-4442-9490-76a7e2c6ab13"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.606272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0482d8-3b39-4442-9490-76a7e2c6ab13-kube-api-access-wgn8m" (OuterVolumeSpecName: "kube-api-access-wgn8m") pod "aa0482d8-3b39-4442-9490-76a7e2c6ab13" (UID: "aa0482d8-3b39-4442-9490-76a7e2c6ab13"). InnerVolumeSpecName "kube-api-access-wgn8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.624918 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "aa0482d8-3b39-4442-9490-76a7e2c6ab13" (UID: "aa0482d8-3b39-4442-9490-76a7e2c6ab13"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.633004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "aa0482d8-3b39-4442-9490-76a7e2c6ab13" (UID: "aa0482d8-3b39-4442-9490-76a7e2c6ab13"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.633809 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-scripts" (OuterVolumeSpecName: "scripts") pod "aa0482d8-3b39-4442-9490-76a7e2c6ab13" (UID: "aa0482d8-3b39-4442-9490-76a7e2c6ab13"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.702413 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0482d8-3b39-4442-9490-76a7e2c6ab13-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.702443 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.702457 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa0482d8-3b39-4442-9490-76a7e2c6ab13-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.702469 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgn8m\" (UniqueName: \"kubernetes.io/projected/aa0482d8-3b39-4442-9490-76a7e2c6ab13-kube-api-access-wgn8m\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:51 crc kubenswrapper[4831]: I0309 16:41:51.702483 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa0482d8-3b39-4442-9490-76a7e2c6ab13-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 16:41:52 crc kubenswrapper[4831]: I0309 16:41:52.198063 4831 scope.go:117] "RemoveContainer" containerID="aef1b3c8a5e667b7c0e9cb8c5a8374e040de6a6ee9e92326c3e6158085eaee1b" Mar 09 16:41:52 crc kubenswrapper[4831]: I0309 16:41:52.198176 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jlxl5" Mar 09 16:41:53 crc kubenswrapper[4831]: I0309 16:41:53.630546 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0482d8-3b39-4442-9490-76a7e2c6ab13" path="/var/lib/kubelet/pods/aa0482d8-3b39-4442-9490-76a7e2c6ab13/volumes" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.152218 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551242-lb6t7"] Mar 09 16:42:00 crc kubenswrapper[4831]: E0309 16:42:00.153083 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0482d8-3b39-4442-9490-76a7e2c6ab13" containerName="swift-ring-rebalance" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.153098 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0482d8-3b39-4442-9490-76a7e2c6ab13" containerName="swift-ring-rebalance" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.153293 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0482d8-3b39-4442-9490-76a7e2c6ab13" containerName="swift-ring-rebalance" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.153856 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.156625 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.157000 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.157483 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.176498 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551242-lb6t7"] Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.246534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzj4q\" (UniqueName: \"kubernetes.io/projected/305f28c8-2d47-4a94-8913-abe62148b492-kube-api-access-dzj4q\") pod \"auto-csr-approver-29551242-lb6t7\" (UID: \"305f28c8-2d47-4a94-8913-abe62148b492\") " pod="openshift-infra/auto-csr-approver-29551242-lb6t7" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.348802 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzj4q\" (UniqueName: \"kubernetes.io/projected/305f28c8-2d47-4a94-8913-abe62148b492-kube-api-access-dzj4q\") pod \"auto-csr-approver-29551242-lb6t7\" (UID: \"305f28c8-2d47-4a94-8913-abe62148b492\") " pod="openshift-infra/auto-csr-approver-29551242-lb6t7" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.369406 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzj4q\" (UniqueName: \"kubernetes.io/projected/305f28c8-2d47-4a94-8913-abe62148b492-kube-api-access-dzj4q\") pod \"auto-csr-approver-29551242-lb6t7\" (UID: \"305f28c8-2d47-4a94-8913-abe62148b492\") " 
pod="openshift-infra/auto-csr-approver-29551242-lb6t7" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.475389 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.971845 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551242-lb6t7"] Mar 09 16:42:00 crc kubenswrapper[4831]: I0309 16:42:00.986553 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:42:01 crc kubenswrapper[4831]: I0309 16:42:01.286351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" event={"ID":"305f28c8-2d47-4a94-8913-abe62148b492","Type":"ContainerStarted","Data":"5694178bf9168f35a05bcdc741e25f96510dbb51f13117ec351098720a0f75d0"} Mar 09 16:42:02 crc kubenswrapper[4831]: I0309 16:42:02.298290 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" event={"ID":"305f28c8-2d47-4a94-8913-abe62148b492","Type":"ContainerStarted","Data":"434391eb2c19ceb3ee7c98a58a1d1f3aa6e27624da0c2413e3bfbc0d46ed0559"} Mar 09 16:42:02 crc kubenswrapper[4831]: I0309 16:42:02.318237 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" podStartSLOduration=1.29567338 podStartE2EDuration="2.318216051s" podCreationTimestamp="2026-03-09 16:42:00 +0000 UTC" firstStartedPulling="2026-03-09 16:42:00.986278861 +0000 UTC m=+2648.119961284" lastFinishedPulling="2026-03-09 16:42:02.008821532 +0000 UTC m=+2649.142503955" observedRunningTime="2026-03-09 16:42:02.312145248 +0000 UTC m=+2649.445827671" watchObservedRunningTime="2026-03-09 16:42:02.318216051 +0000 UTC m=+2649.451898494" Mar 09 16:42:03 crc kubenswrapper[4831]: I0309 16:42:03.321067 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="305f28c8-2d47-4a94-8913-abe62148b492" containerID="434391eb2c19ceb3ee7c98a58a1d1f3aa6e27624da0c2413e3bfbc0d46ed0559" exitCode=0 Mar 09 16:42:03 crc kubenswrapper[4831]: I0309 16:42:03.321142 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" event={"ID":"305f28c8-2d47-4a94-8913-abe62148b492","Type":"ContainerDied","Data":"434391eb2c19ceb3ee7c98a58a1d1f3aa6e27624da0c2413e3bfbc0d46ed0559"} Mar 09 16:42:04 crc kubenswrapper[4831]: I0309 16:42:04.666259 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" Mar 09 16:42:04 crc kubenswrapper[4831]: I0309 16:42:04.829549 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzj4q\" (UniqueName: \"kubernetes.io/projected/305f28c8-2d47-4a94-8913-abe62148b492-kube-api-access-dzj4q\") pod \"305f28c8-2d47-4a94-8913-abe62148b492\" (UID: \"305f28c8-2d47-4a94-8913-abe62148b492\") " Mar 09 16:42:04 crc kubenswrapper[4831]: I0309 16:42:04.835884 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305f28c8-2d47-4a94-8913-abe62148b492-kube-api-access-dzj4q" (OuterVolumeSpecName: "kube-api-access-dzj4q") pod "305f28c8-2d47-4a94-8913-abe62148b492" (UID: "305f28c8-2d47-4a94-8913-abe62148b492"). InnerVolumeSpecName "kube-api-access-dzj4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:42:04 crc kubenswrapper[4831]: I0309 16:42:04.926259 4831 scope.go:117] "RemoveContainer" containerID="e5b63fb8a0daf0b9b267bc0f075dde3ca0f45befb92dff8d6d0380c9a16f1665" Mar 09 16:42:04 crc kubenswrapper[4831]: I0309 16:42:04.931787 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzj4q\" (UniqueName: \"kubernetes.io/projected/305f28c8-2d47-4a94-8913-abe62148b492-kube-api-access-dzj4q\") on node \"crc\" DevicePath \"\"" Mar 09 16:42:04 crc kubenswrapper[4831]: I0309 16:42:04.963130 4831 scope.go:117] "RemoveContainer" containerID="28df16619f57c2008bd9ad9eb4b57306183a7d6a545323bd15576430bf91eac9" Mar 09 16:42:04 crc kubenswrapper[4831]: I0309 16:42:04.999345 4831 scope.go:117] "RemoveContainer" containerID="21684a69419ec4141daf231dbae65205810af78772b059a2f08a6b1b29efd88c" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.045764 4831 scope.go:117] "RemoveContainer" containerID="6d81775386d2a78cb830e09f3729c8e3e57d5864452c77720cc872936e756149" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.068393 4831 scope.go:117] "RemoveContainer" containerID="0a22a67d5e3ef26de183d2eaf71f40976eb4f596ae122e72f6726a5f22e28f2d" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.101191 4831 scope.go:117] "RemoveContainer" containerID="95b0146eaa2253c3faa8461b3cd3bffc584d7a93938350d735aa2254e245fff9" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.127946 4831 scope.go:117] "RemoveContainer" containerID="2ddba3dbe33dffe31bfd0498d4aaa4c65a5dc5fa3aed2fa366447655ff3ff729" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.155757 4831 scope.go:117] "RemoveContainer" containerID="1b0b8fd2849e589c709d3cf40ccd7b8f89cf8f3745b0521db9878e3f513371ca" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.182026 4831 scope.go:117] "RemoveContainer" containerID="7543ef8bd08ef713307ca7dff175df34d884b79a59613ae619f0c92f0fe8dfd3" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 
16:42:05.211110 4831 scope.go:117] "RemoveContainer" containerID="868e719094c890095004e9571bcad4931f5cb42ff0bc81974ff98394fb1862b4" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.369228 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.369274 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551242-lb6t7" event={"ID":"305f28c8-2d47-4a94-8913-abe62148b492","Type":"ContainerDied","Data":"5694178bf9168f35a05bcdc741e25f96510dbb51f13117ec351098720a0f75d0"} Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.369301 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5694178bf9168f35a05bcdc741e25f96510dbb51f13117ec351098720a0f75d0" Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.376806 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551236-6cqjf"] Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.383215 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551236-6cqjf"] Mar 09 16:42:05 crc kubenswrapper[4831]: I0309 16:42:05.630655 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d23ea67-6361-4fa7-b66a-3406731b8b5d" path="/var/lib/kubelet/pods/1d23ea67-6361-4fa7-b66a-3406731b8b5d/volumes" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.427554 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qdhl8/must-gather-p64fv"] Mar 09 16:42:23 crc kubenswrapper[4831]: E0309 16:42:23.428580 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305f28c8-2d47-4a94-8913-abe62148b492" containerName="oc" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.428598 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="305f28c8-2d47-4a94-8913-abe62148b492" 
containerName="oc" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.428810 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="305f28c8-2d47-4a94-8913-abe62148b492" containerName="oc" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.429733 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.431495 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qdhl8"/"openshift-service-ca.crt" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.432292 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qdhl8"/"kube-root-ca.crt" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.452602 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qdhl8/must-gather-p64fv"] Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.513014 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee9f6bb3-a55d-402e-873d-6c470c44deed-must-gather-output\") pod \"must-gather-p64fv\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.513088 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b6tt\" (UniqueName: \"kubernetes.io/projected/ee9f6bb3-a55d-402e-873d-6c470c44deed-kube-api-access-9b6tt\") pod \"must-gather-p64fv\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.614223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b6tt\" (UniqueName: 
\"kubernetes.io/projected/ee9f6bb3-a55d-402e-873d-6c470c44deed-kube-api-access-9b6tt\") pod \"must-gather-p64fv\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.614340 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee9f6bb3-a55d-402e-873d-6c470c44deed-must-gather-output\") pod \"must-gather-p64fv\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.614759 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee9f6bb3-a55d-402e-873d-6c470c44deed-must-gather-output\") pod \"must-gather-p64fv\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.666821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b6tt\" (UniqueName: \"kubernetes.io/projected/ee9f6bb3-a55d-402e-873d-6c470c44deed-kube-api-access-9b6tt\") pod \"must-gather-p64fv\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:23 crc kubenswrapper[4831]: I0309 16:42:23.749165 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:42:24 crc kubenswrapper[4831]: I0309 16:42:24.165518 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qdhl8/must-gather-p64fv"] Mar 09 16:42:24 crc kubenswrapper[4831]: I0309 16:42:24.545095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qdhl8/must-gather-p64fv" event={"ID":"ee9f6bb3-a55d-402e-873d-6c470c44deed","Type":"ContainerStarted","Data":"690f48e1b4b8fdd7ca608b5a601ec5acb0f1c1dbe592fea909ce38474d94ab32"} Mar 09 16:42:30 crc kubenswrapper[4831]: I0309 16:42:30.615969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qdhl8/must-gather-p64fv" event={"ID":"ee9f6bb3-a55d-402e-873d-6c470c44deed","Type":"ContainerStarted","Data":"2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2"} Mar 09 16:42:30 crc kubenswrapper[4831]: I0309 16:42:30.616826 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qdhl8/must-gather-p64fv" event={"ID":"ee9f6bb3-a55d-402e-873d-6c470c44deed","Type":"ContainerStarted","Data":"649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84"} Mar 09 16:42:30 crc kubenswrapper[4831]: I0309 16:42:30.635231 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qdhl8/must-gather-p64fv" podStartSLOduration=1.7839735 podStartE2EDuration="7.635206986s" podCreationTimestamp="2026-03-09 16:42:23 +0000 UTC" firstStartedPulling="2026-03-09 16:42:24.174078274 +0000 UTC m=+2671.307760697" lastFinishedPulling="2026-03-09 16:42:30.02531176 +0000 UTC m=+2677.158994183" observedRunningTime="2026-03-09 16:42:30.630819301 +0000 UTC m=+2677.764501724" watchObservedRunningTime="2026-03-09 16:42:30.635206986 +0000 UTC m=+2677.768889409" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.551930 4831 scope.go:117] "RemoveContainer" 
containerID="df6d241fdb1cc30609100896af6197519ae6960704f499b1cd64dfe3c0f5d254" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.582181 4831 scope.go:117] "RemoveContainer" containerID="1eacd36498644e7f770140db1b15c6e7f262f3a8b54e0d8ba5974bf993d2a2c9" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.613465 4831 scope.go:117] "RemoveContainer" containerID="2ff015b92df50ca487a5d142510545241754fb9210ecbb6531a2fa3e1a16b0e5" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.642611 4831 scope.go:117] "RemoveContainer" containerID="ac1c0f5ccf6f36c3c274f744e8b1b005b318a35f212e51717d7dbe9a43af25be" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.683161 4831 scope.go:117] "RemoveContainer" containerID="6be60a4f53b65e5805149cb7281b7cdc37aecb3973b3e604398dfc9ef64783bc" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.711462 4831 scope.go:117] "RemoveContainer" containerID="f2fadcf8cac962c17d3b4775e23a348f950ff77c6ee60b8ddd91196a79fdf78d" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.753874 4831 scope.go:117] "RemoveContainer" containerID="5de2a78cd4a02d6f6f16c12e808e08684ea8582034c5b91977cf024017b45d62" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.775285 4831 scope.go:117] "RemoveContainer" containerID="09f1853d56c4d9bb0e109dd13097d4c4c8d2ef0d5f31a20039457d19447577fb" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.799458 4831 scope.go:117] "RemoveContainer" containerID="207960b74e8c50b5051ccc3f4f02d55e4a3ff4b989b9b9e018677ef18eddcc90" Mar 09 16:43:05 crc kubenswrapper[4831]: I0309 16:43:05.821297 4831 scope.go:117] "RemoveContainer" containerID="040fd1818ff8b62ebd3f5eca84edbfcca285d8ec196b648f17c676722773f7b8" Mar 09 16:43:09 crc kubenswrapper[4831]: I0309 16:43:09.479732 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz_fcdc2307-f387-41ea-851e-2a7cb4fda4f8/util/0.log" Mar 09 16:43:09 crc kubenswrapper[4831]: I0309 
16:43:09.709256 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz_fcdc2307-f387-41ea-851e-2a7cb4fda4f8/pull/0.log" Mar 09 16:43:09 crc kubenswrapper[4831]: I0309 16:43:09.732006 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz_fcdc2307-f387-41ea-851e-2a7cb4fda4f8/pull/0.log" Mar 09 16:43:09 crc kubenswrapper[4831]: I0309 16:43:09.738195 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz_fcdc2307-f387-41ea-851e-2a7cb4fda4f8/util/0.log" Mar 09 16:43:09 crc kubenswrapper[4831]: I0309 16:43:09.908920 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz_fcdc2307-f387-41ea-851e-2a7cb4fda4f8/util/0.log" Mar 09 16:43:09 crc kubenswrapper[4831]: I0309 16:43:09.945533 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz_fcdc2307-f387-41ea-851e-2a7cb4fda4f8/pull/0.log" Mar 09 16:43:09 crc kubenswrapper[4831]: I0309 16:43:09.965741 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0a8h7kz_fcdc2307-f387-41ea-851e-2a7cb4fda4f8/extract/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.096217 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9_139b863f-62d1-47a8-b4ec-e39d769d02ac/util/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.242850 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9_139b863f-62d1-47a8-b4ec-e39d769d02ac/util/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.268241 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9_139b863f-62d1-47a8-b4ec-e39d769d02ac/pull/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.283481 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9_139b863f-62d1-47a8-b4ec-e39d769d02ac/pull/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.432962 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9_139b863f-62d1-47a8-b4ec-e39d769d02ac/pull/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.437684 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9_139b863f-62d1-47a8-b4ec-e39d769d02ac/extract/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.450713 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3bd7048dcc37701b02bcbff4abc2ad935d3fd7d4931f30216904b46ddesfqp9_139b863f-62d1-47a8-b4ec-e39d769d02ac/util/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.605673 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg_8a3b8608-bf80-4c51-a661-65b0c5056ecc/util/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.776075 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg_8a3b8608-bf80-4c51-a661-65b0c5056ecc/util/0.log" Mar 09 16:43:10 crc 
kubenswrapper[4831]: I0309 16:43:10.816977 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg_8a3b8608-bf80-4c51-a661-65b0c5056ecc/pull/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.838596 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg_8a3b8608-bf80-4c51-a661-65b0c5056ecc/pull/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.958448 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg_8a3b8608-bf80-4c51-a661-65b0c5056ecc/util/0.log" Mar 09 16:43:10 crc kubenswrapper[4831]: I0309 16:43:10.959973 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg_8a3b8608-bf80-4c51-a661-65b0c5056ecc/pull/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.016898 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s7wtg_8a3b8608-bf80-4c51-a661-65b0c5056ecc/extract/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.145076 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg_5349223e-9c5e-4621-b8c0-d7ee2e192d46/util/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.274202 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg_5349223e-9c5e-4621-b8c0-d7ee2e192d46/pull/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.279208 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg_5349223e-9c5e-4621-b8c0-d7ee2e192d46/pull/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.298328 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg_5349223e-9c5e-4621-b8c0-d7ee2e192d46/util/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.441427 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg_5349223e-9c5e-4621-b8c0-d7ee2e192d46/util/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.444713 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg_5349223e-9c5e-4621-b8c0-d7ee2e192d46/pull/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.516164 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40gk5cg_5349223e-9c5e-4621-b8c0-d7ee2e192d46/extract/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.622819 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58_57a06f9e-e898-4c05-a894-76fdcac7f633/util/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.775569 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58_57a06f9e-e898-4c05-a894-76fdcac7f633/pull/0.log" Mar 09 16:43:11 crc kubenswrapper[4831]: I0309 16:43:11.775707 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58_57a06f9e-e898-4c05-a894-76fdcac7f633/pull/0.log" Mar 09 16:43:11 crc 
kubenswrapper[4831]: I0309 16:43:11.785756 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58_57a06f9e-e898-4c05-a894-76fdcac7f633/util/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.008918 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58_57a06f9e-e898-4c05-a894-76fdcac7f633/pull/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.082322 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58_57a06f9e-e898-4c05-a894-76fdcac7f633/util/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.083905 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6l2k58_57a06f9e-e898-4c05-a894-76fdcac7f633/extract/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.276759 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-index-6qh4f_16d21189-91cf-4f36-b6f3-96a240bd6167/registry-server/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.535042 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7_7f3e2918-6395-4f53-8f31-e35c50beb83a/util/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.766242 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7_7f3e2918-6395-4f53-8f31-e35c50beb83a/util/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.801955 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7_7f3e2918-6395-4f53-8f31-e35c50beb83a/pull/0.log" Mar 09 16:43:12 crc kubenswrapper[4831]: I0309 16:43:12.802124 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7_7f3e2918-6395-4f53-8f31-e35c50beb83a/pull/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.027771 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7_7f3e2918-6395-4f53-8f31-e35c50beb83a/util/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.089916 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7_7f3e2918-6395-4f53-8f31-e35c50beb83a/extract/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.101805 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c98ds7_7f3e2918-6395-4f53-8f31-e35c50beb83a/pull/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.294312 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fc6567686-lm7h4_aace6467-2dc4-43d8-ad52-740b182000dd/manager/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.457583 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-jf47m_f9b71116-cde3-4cb8-89c5-d24d4e379771/registry-server/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.550363 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-78fb4b8689-9qprv_f7e6dce0-2482-4e60-88f0-d53f94df4a68/manager/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.747322 
4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-sf6xv_70894f60-1ec0-4a31-a25d-3acc3e358869/registry-server/0.log" Mar 09 16:43:13 crc kubenswrapper[4831]: I0309 16:43:13.801283 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-84c5d77ccf-pvs69_04aafaf0-8914-454d-8c30-0d9655615704/manager/0.log" Mar 09 16:43:14 crc kubenswrapper[4831]: I0309 16:43:14.010603 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-ktplw_04284f29-23b1-41a2-b851-6391b29c4cb4/registry-server/0.log" Mar 09 16:43:14 crc kubenswrapper[4831]: I0309 16:43:14.041285 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-s5zp2_c4a0d3e8-756a-499f-bfd7-a720f83cbd6e/operator/0.log" Mar 09 16:43:14 crc kubenswrapper[4831]: I0309 16:43:14.212024 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-ltn84_da24a64e-1143-493e-a8ea-c944fabd209e/registry-server/0.log" Mar 09 16:43:14 crc kubenswrapper[4831]: I0309 16:43:14.334142 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6d856b55c6-8mbrf_a12c2319-1993-409d-ace0-5bc02371e54e/manager/0.log" Mar 09 16:43:14 crc kubenswrapper[4831]: I0309 16:43:14.502055 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-nq5bh_4dc14231-9873-414d-8a81-5c5b4857bde9/registry-server/0.log" Mar 09 16:43:15 crc kubenswrapper[4831]: I0309 16:43:15.047688 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64ff7758ff-l4vrw_eeb63698-ca7a-4598-a6b3-11e1fca09406/manager/0.log" Mar 09 16:43:28 crc kubenswrapper[4831]: I0309 16:43:28.231590 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h96p9_75ce18b8-5e44-47af-801c-97f9963d1786/control-plane-machine-set-operator/0.log" Mar 09 16:43:28 crc kubenswrapper[4831]: I0309 16:43:28.397794 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lg8th_314f92cb-af76-454e-b67c-f056477de5e9/kube-rbac-proxy/0.log" Mar 09 16:43:28 crc kubenswrapper[4831]: I0309 16:43:28.421170 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lg8th_314f92cb-af76-454e-b67c-f056477de5e9/machine-api-operator/0.log" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.665596 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8tnwg"] Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.679353 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.693653 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tnwg"] Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.739083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-catalog-content\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.739441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spwhm\" (UniqueName: \"kubernetes.io/projected/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-kube-api-access-spwhm\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " 
pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.739465 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-utilities\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.841249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spwhm\" (UniqueName: \"kubernetes.io/projected/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-kube-api-access-spwhm\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.841293 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-utilities\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.841790 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-utilities\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.841884 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-catalog-content\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " 
pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.842144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-catalog-content\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:31 crc kubenswrapper[4831]: I0309 16:43:31.860809 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spwhm\" (UniqueName: \"kubernetes.io/projected/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-kube-api-access-spwhm\") pod \"redhat-marketplace-8tnwg\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:32 crc kubenswrapper[4831]: I0309 16:43:32.012691 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:32 crc kubenswrapper[4831]: I0309 16:43:32.502932 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tnwg"] Mar 09 16:43:33 crc kubenswrapper[4831]: I0309 16:43:33.138308 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerID="e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13" exitCode=0 Mar 09 16:43:33 crc kubenswrapper[4831]: I0309 16:43:33.138376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tnwg" event={"ID":"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05","Type":"ContainerDied","Data":"e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13"} Mar 09 16:43:33 crc kubenswrapper[4831]: I0309 16:43:33.138427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tnwg" 
event={"ID":"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05","Type":"ContainerStarted","Data":"d5697ffb0f05a5ad90919c943045b7e6c07d97fc75f1645250d5ebb59e310eea"} Mar 09 16:43:34 crc kubenswrapper[4831]: I0309 16:43:34.155885 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tnwg" event={"ID":"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05","Type":"ContainerStarted","Data":"c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659"} Mar 09 16:43:35 crc kubenswrapper[4831]: I0309 16:43:35.167007 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerID="c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659" exitCode=0 Mar 09 16:43:35 crc kubenswrapper[4831]: I0309 16:43:35.167083 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tnwg" event={"ID":"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05","Type":"ContainerDied","Data":"c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659"} Mar 09 16:43:36 crc kubenswrapper[4831]: I0309 16:43:36.185566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tnwg" event={"ID":"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05","Type":"ContainerStarted","Data":"b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b"} Mar 09 16:43:36 crc kubenswrapper[4831]: I0309 16:43:36.210849 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8tnwg" podStartSLOduration=2.76011933 podStartE2EDuration="5.210831354s" podCreationTimestamp="2026-03-09 16:43:31 +0000 UTC" firstStartedPulling="2026-03-09 16:43:33.140029103 +0000 UTC m=+2740.273711526" lastFinishedPulling="2026-03-09 16:43:35.590741127 +0000 UTC m=+2742.724423550" observedRunningTime="2026-03-09 16:43:36.204486623 +0000 UTC m=+2743.338169036" watchObservedRunningTime="2026-03-09 16:43:36.210831354 +0000 UTC 
m=+2743.344513767" Mar 09 16:43:42 crc kubenswrapper[4831]: I0309 16:43:42.013031 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:42 crc kubenswrapper[4831]: I0309 16:43:42.013632 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:43 crc kubenswrapper[4831]: I0309 16:43:43.055804 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-8tnwg" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="registry-server" probeResult="failure" output=< Mar 09 16:43:43 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Mar 09 16:43:43 crc kubenswrapper[4831]: > Mar 09 16:43:52 crc kubenswrapper[4831]: I0309 16:43:52.126359 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:52 crc kubenswrapper[4831]: I0309 16:43:52.176241 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:52 crc kubenswrapper[4831]: I0309 16:43:52.384535 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tnwg"] Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.329231 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8tnwg" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="registry-server" containerID="cri-o://b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b" gracePeriod=2 Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.779738 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.953095 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spwhm\" (UniqueName: \"kubernetes.io/projected/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-kube-api-access-spwhm\") pod \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.953156 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-utilities\") pod \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.953222 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-catalog-content\") pod \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\" (UID: \"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05\") " Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.953704 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-utilities" (OuterVolumeSpecName: "utilities") pod "6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" (UID: "6c72ef67-a77a-4fb2-8dc7-7cc653e68c05"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.955085 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.964631 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-kube-api-access-spwhm" (OuterVolumeSpecName: "kube-api-access-spwhm") pod "6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" (UID: "6c72ef67-a77a-4fb2-8dc7-7cc653e68c05"). InnerVolumeSpecName "kube-api-access-spwhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:43:53 crc kubenswrapper[4831]: I0309 16:43:53.990309 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" (UID: "6c72ef67-a77a-4fb2-8dc7-7cc653e68c05"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.056463 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spwhm\" (UniqueName: \"kubernetes.io/projected/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-kube-api-access-spwhm\") on node \"crc\" DevicePath \"\"" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.056501 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.339145 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerID="b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b" exitCode=0 Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.339205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tnwg" event={"ID":"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05","Type":"ContainerDied","Data":"b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b"} Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.339237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tnwg" event={"ID":"6c72ef67-a77a-4fb2-8dc7-7cc653e68c05","Type":"ContainerDied","Data":"d5697ffb0f05a5ad90919c943045b7e6c07d97fc75f1645250d5ebb59e310eea"} Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.339251 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tnwg" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.339283 4831 scope.go:117] "RemoveContainer" containerID="b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.361568 4831 scope.go:117] "RemoveContainer" containerID="c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.377064 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tnwg"] Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.383518 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tnwg"] Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.393441 4831 scope.go:117] "RemoveContainer" containerID="e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.413213 4831 scope.go:117] "RemoveContainer" containerID="b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b" Mar 09 16:43:54 crc kubenswrapper[4831]: E0309 16:43:54.413659 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b\": container with ID starting with b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b not found: ID does not exist" containerID="b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.413711 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b"} err="failed to get container status \"b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b\": rpc error: code = NotFound desc = could not find container 
\"b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b\": container with ID starting with b73060053cdc8cbed637333b56925187ebefdb8887b64e883e8ba34a016c460b not found: ID does not exist" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.413739 4831 scope.go:117] "RemoveContainer" containerID="c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659" Mar 09 16:43:54 crc kubenswrapper[4831]: E0309 16:43:54.414151 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659\": container with ID starting with c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659 not found: ID does not exist" containerID="c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.414182 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659"} err="failed to get container status \"c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659\": rpc error: code = NotFound desc = could not find container \"c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659\": container with ID starting with c1feeb764af346e392bbf8e993a0f1e4f201ebf1500fa9898852df02b6407659 not found: ID does not exist" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.414203 4831 scope.go:117] "RemoveContainer" containerID="e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13" Mar 09 16:43:54 crc kubenswrapper[4831]: E0309 16:43:54.414653 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13\": container with ID starting with e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13 not found: ID does not exist" 
containerID="e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13" Mar 09 16:43:54 crc kubenswrapper[4831]: I0309 16:43:54.414681 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13"} err="failed to get container status \"e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13\": rpc error: code = NotFound desc = could not find container \"e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13\": container with ID starting with e62ce6aa5603de7db50d2dfc39cad0e5507bbd34568a354944f45843de375c13 not found: ID does not exist" Mar 09 16:43:55 crc kubenswrapper[4831]: I0309 16:43:55.628550 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" path="/var/lib/kubelet/pods/6c72ef67-a77a-4fb2-8dc7-7cc653e68c05/volumes" Mar 09 16:43:57 crc kubenswrapper[4831]: I0309 16:43:57.545765 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-lnrcv_68cf559a-a36c-4fae-9e8c-7130a85dd894/kube-rbac-proxy/0.log" Mar 09 16:43:57 crc kubenswrapper[4831]: I0309 16:43:57.566712 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-lnrcv_68cf559a-a36c-4fae-9e8c-7130a85dd894/controller/0.log" Mar 09 16:43:57 crc kubenswrapper[4831]: I0309 16:43:57.728381 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-frr-files/0.log" Mar 09 16:43:57 crc kubenswrapper[4831]: I0309 16:43:57.951309 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-metrics/0.log" Mar 09 16:43:57 crc kubenswrapper[4831]: I0309 16:43:57.951311 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-frr-files/0.log" Mar 09 16:43:57 crc kubenswrapper[4831]: I0309 16:43:57.977701 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-reloader/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.009762 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-reloader/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.148294 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-metrics/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.149992 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-reloader/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.171283 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-metrics/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.173016 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-frr-files/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.374321 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-frr-files/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.408111 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-metrics/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.409285 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/cp-reloader/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.432299 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/controller/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.621923 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/kube-rbac-proxy/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.624254 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/kube-rbac-proxy-frr/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.627099 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/frr-metrics/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.828489 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/reloader/0.log" Mar 09 16:43:58 crc kubenswrapper[4831]: I0309 16:43:58.924031 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-pdcxr_12da408b-c4d6-4196-bb46-9e9c741b0819/frr-k8s-webhook-server/0.log" Mar 09 16:43:59 crc kubenswrapper[4831]: I0309 16:43:59.102553 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c78bb468c-hspmr_49b4858c-b806-4899-9332-1a23e118cf9e/manager/0.log" Mar 09 16:43:59 crc kubenswrapper[4831]: I0309 16:43:59.316004 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7bb99c7556-ztbpl_3d412b43-90c3-4c5e-9967-22fabd2055b4/webhook-server/0.log" Mar 09 16:43:59 crc kubenswrapper[4831]: I0309 16:43:59.393506 4831 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jm6qg_2ae44691-87ac-4a4e-83c9-8f11dce70777/kube-rbac-proxy/0.log" Mar 09 16:43:59 crc kubenswrapper[4831]: I0309 16:43:59.745904 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jm6qg_2ae44691-87ac-4a4e-83c9-8f11dce70777/speaker/0.log" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.136813 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551244-t88g9"] Mar 09 16:44:00 crc kubenswrapper[4831]: E0309 16:44:00.137110 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="extract-content" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.137123 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="extract-content" Mar 09 16:44:00 crc kubenswrapper[4831]: E0309 16:44:00.137143 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="extract-utilities" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.137150 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="extract-utilities" Mar 09 16:44:00 crc kubenswrapper[4831]: E0309 16:44:00.137165 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="registry-server" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.137171 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="registry-server" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.137293 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c72ef67-a77a-4fb2-8dc7-7cc653e68c05" containerName="registry-server" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.137772 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551244-t88g9" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.145071 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-czs2j_dc627a54-0b63-4b4a-a9fc-db74921f2a63/frr/0.log" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.145073 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.145528 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.145536 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.160778 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551244-t88g9"] Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.240015 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmvr\" (UniqueName: \"kubernetes.io/projected/49d1adef-dd6d-4a13-8adb-36459b7bfa46-kube-api-access-sgmvr\") pod \"auto-csr-approver-29551244-t88g9\" (UID: \"49d1adef-dd6d-4a13-8adb-36459b7bfa46\") " pod="openshift-infra/auto-csr-approver-29551244-t88g9" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.342090 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgmvr\" (UniqueName: \"kubernetes.io/projected/49d1adef-dd6d-4a13-8adb-36459b7bfa46-kube-api-access-sgmvr\") pod \"auto-csr-approver-29551244-t88g9\" (UID: \"49d1adef-dd6d-4a13-8adb-36459b7bfa46\") " pod="openshift-infra/auto-csr-approver-29551244-t88g9" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.368475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-sgmvr\" (UniqueName: \"kubernetes.io/projected/49d1adef-dd6d-4a13-8adb-36459b7bfa46-kube-api-access-sgmvr\") pod \"auto-csr-approver-29551244-t88g9\" (UID: \"49d1adef-dd6d-4a13-8adb-36459b7bfa46\") " pod="openshift-infra/auto-csr-approver-29551244-t88g9" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.462118 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551244-t88g9" Mar 09 16:44:00 crc kubenswrapper[4831]: I0309 16:44:00.912737 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551244-t88g9"] Mar 09 16:44:01 crc kubenswrapper[4831]: I0309 16:44:01.394385 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551244-t88g9" event={"ID":"49d1adef-dd6d-4a13-8adb-36459b7bfa46","Type":"ContainerStarted","Data":"6fc0bb57aa78d06254f9ee1e6c8c65f8c7324c9d2b4a589aed18a6d01b6953d3"} Mar 09 16:44:02 crc kubenswrapper[4831]: I0309 16:44:02.403796 4831 generic.go:334] "Generic (PLEG): container finished" podID="49d1adef-dd6d-4a13-8adb-36459b7bfa46" containerID="7d61f4a936da9dd0c3e7cd9367e002875c82a83b365b64b2d71c24504ad10d1e" exitCode=0 Mar 09 16:44:02 crc kubenswrapper[4831]: I0309 16:44:02.403859 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551244-t88g9" event={"ID":"49d1adef-dd6d-4a13-8adb-36459b7bfa46","Type":"ContainerDied","Data":"7d61f4a936da9dd0c3e7cd9367e002875c82a83b365b64b2d71c24504ad10d1e"} Mar 09 16:44:03 crc kubenswrapper[4831]: I0309 16:44:03.018497 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:44:03 crc kubenswrapper[4831]: I0309 16:44:03.018824 4831 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:44:03 crc kubenswrapper[4831]: I0309 16:44:03.680750 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551244-t88g9" Mar 09 16:44:03 crc kubenswrapper[4831]: I0309 16:44:03.794185 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgmvr\" (UniqueName: \"kubernetes.io/projected/49d1adef-dd6d-4a13-8adb-36459b7bfa46-kube-api-access-sgmvr\") pod \"49d1adef-dd6d-4a13-8adb-36459b7bfa46\" (UID: \"49d1adef-dd6d-4a13-8adb-36459b7bfa46\") " Mar 09 16:44:03 crc kubenswrapper[4831]: I0309 16:44:03.799156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d1adef-dd6d-4a13-8adb-36459b7bfa46-kube-api-access-sgmvr" (OuterVolumeSpecName: "kube-api-access-sgmvr") pod "49d1adef-dd6d-4a13-8adb-36459b7bfa46" (UID: "49d1adef-dd6d-4a13-8adb-36459b7bfa46"). InnerVolumeSpecName "kube-api-access-sgmvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:44:03 crc kubenswrapper[4831]: I0309 16:44:03.895995 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgmvr\" (UniqueName: \"kubernetes.io/projected/49d1adef-dd6d-4a13-8adb-36459b7bfa46-kube-api-access-sgmvr\") on node \"crc\" DevicePath \"\"" Mar 09 16:44:04 crc kubenswrapper[4831]: I0309 16:44:04.419812 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551244-t88g9" event={"ID":"49d1adef-dd6d-4a13-8adb-36459b7bfa46","Type":"ContainerDied","Data":"6fc0bb57aa78d06254f9ee1e6c8c65f8c7324c9d2b4a589aed18a6d01b6953d3"} Mar 09 16:44:04 crc kubenswrapper[4831]: I0309 16:44:04.419860 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551244-t88g9" Mar 09 16:44:04 crc kubenswrapper[4831]: I0309 16:44:04.419863 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc0bb57aa78d06254f9ee1e6c8c65f8c7324c9d2b4a589aed18a6d01b6953d3" Mar 09 16:44:04 crc kubenswrapper[4831]: I0309 16:44:04.755894 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551238-8jjbk"] Mar 09 16:44:04 crc kubenswrapper[4831]: I0309 16:44:04.761557 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551238-8jjbk"] Mar 09 16:44:05 crc kubenswrapper[4831]: I0309 16:44:05.628436 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27a2831-75a5-4e0b-91f2-4320c114928b" path="/var/lib/kubelet/pods/f27a2831-75a5-4e0b-91f2-4320c114928b/volumes" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.002249 4831 scope.go:117] "RemoveContainer" containerID="c47111938c0344d6411cad0b21e657b51cc3160f9b044e63632db58785d9dd44" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.029440 4831 scope.go:117] "RemoveContainer" 
containerID="c1459bb063865602fde9243117afb48ea0c9849d0f17acc9809de9bc63f99e5b" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.064745 4831 scope.go:117] "RemoveContainer" containerID="3aa6b55d3123155335bb8b069017cf8ca4f65e9b70655a8cd396be081867c61a" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.088479 4831 scope.go:117] "RemoveContainer" containerID="ac8b006ae1625d209a9d16da32ed0c840a4fe96f726a777ba009bba858dc29f6" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.123551 4831 scope.go:117] "RemoveContainer" containerID="fc3fabf66216882a5aafded15bed68839b0b511c431c04eaf20cf8c41871b256" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.164110 4831 scope.go:117] "RemoveContainer" containerID="fe3f7da4548420787610f1f78236bd25a4553484f327e6ef5c0478073959081b" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.203847 4831 scope.go:117] "RemoveContainer" containerID="d2d58cdb4fc9a6d3600540da71b318ce81e478473c51ecdcd7b9c39a67d624ce" Mar 09 16:44:06 crc kubenswrapper[4831]: I0309 16:44:06.227785 4831 scope.go:117] "RemoveContainer" containerID="605adcaacd35d69fcc293b12923edb09fc03cb9ad5499ae2da8af83833c7dd70" Mar 09 16:44:14 crc kubenswrapper[4831]: I0309 16:44:14.297479 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-api-f95dc6db4-9zfcs_215f54de-45d6-469b-bfec-3a77a1e3d3d8/barbican-api/0.log" Mar 09 16:44:14 crc kubenswrapper[4831]: I0309 16:44:14.391049 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-api-f95dc6db4-9zfcs_215f54de-45d6-469b-bfec-3a77a1e3d3d8/barbican-api-log/0.log" Mar 09 16:44:14 crc kubenswrapper[4831]: I0309 16:44:14.471868 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-db-sync-mw2df_819181c8-0e88-4a97-b3ee-c6add60e4053/barbican-db-sync/0.log" Mar 09 16:44:14 crc kubenswrapper[4831]: I0309 16:44:14.564832 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_barbican-keystone-listener-5cb45fbbd4-rg7r5_a67fdd93-c613-4b21-aed2-f6163d5405b3/barbican-keystone-listener/0.log" Mar 09 16:44:14 crc kubenswrapper[4831]: I0309 16:44:14.656369 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-keystone-listener-5cb45fbbd4-rg7r5_a67fdd93-c613-4b21-aed2-f6163d5405b3/barbican-keystone-listener-log/0.log" Mar 09 16:44:14 crc kubenswrapper[4831]: I0309 16:44:14.760419 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-worker-564b98bb67-5cv2p_54ed23b1-477a-40b1-830f-67a11831d1e8/barbican-worker-log/0.log" Mar 09 16:44:14 crc kubenswrapper[4831]: I0309 16:44:14.782055 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-worker-564b98bb67-5cv2p_54ed23b1-477a-40b1-830f-67a11831d1e8/barbican-worker/0.log" Mar 09 16:44:15 crc kubenswrapper[4831]: I0309 16:44:15.301254 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665/mysql-bootstrap/0.log" Mar 09 16:44:15 crc kubenswrapper[4831]: I0309 16:44:15.408440 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_keystone-7f64cd86f9-mwgqm_a2fd146a-8317-4a78-b017-e74226b0888d/keystone-api/0.log" Mar 09 16:44:15 crc kubenswrapper[4831]: I0309 16:44:15.491106 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665/mysql-bootstrap/0.log" Mar 09 16:44:15 crc kubenswrapper[4831]: I0309 16:44:15.529818 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_1c3d4d6d-e4fe-4ad6-84d1-60132bc0d665/galera/0.log" Mar 09 16:44:15 crc kubenswrapper[4831]: I0309 16:44:15.753111 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_afb45109-97df-4bd1-80cc-f9374c213039/mysql-bootstrap/0.log" Mar 09 16:44:15 crc kubenswrapper[4831]: I0309 16:44:15.877705 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_afb45109-97df-4bd1-80cc-f9374c213039/mysql-bootstrap/0.log" Mar 09 16:44:15 crc kubenswrapper[4831]: I0309 16:44:15.997340 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_afb45109-97df-4bd1-80cc-f9374c213039/galera/0.log" Mar 09 16:44:16 crc kubenswrapper[4831]: I0309 16:44:16.077782 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_3b2d162d-7772-4a94-8a79-4f9664637cce/mysql-bootstrap/0.log" Mar 09 16:44:16 crc kubenswrapper[4831]: I0309 16:44:16.307796 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_3b2d162d-7772-4a94-8a79-4f9664637cce/mysql-bootstrap/0.log" Mar 09 16:44:16 crc kubenswrapper[4831]: I0309 16:44:16.404599 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_3b2d162d-7772-4a94-8a79-4f9664637cce/galera/0.log" Mar 09 16:44:16 crc kubenswrapper[4831]: I0309 16:44:16.657231 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_d24e185f-9a88-42b9-867e-2814f11c820e/setup-container/0.log" Mar 09 16:44:16 crc kubenswrapper[4831]: I0309 16:44:16.825931 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_d24e185f-9a88-42b9-867e-2814f11c820e/setup-container/0.log" Mar 09 16:44:16 crc kubenswrapper[4831]: I0309 16:44:16.859056 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_d24e185f-9a88-42b9-867e-2814f11c820e/rabbitmq/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.072061 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_swift-proxy-76c998454c-zp7j5_4bcabb16-8d97-47d0-9e50-980536b98a36/proxy-httpd/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.110276 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-proxy-76c998454c-zp7j5_4bcabb16-8d97-47d0-9e50-980536b98a36/proxy-server/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.287892 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-sltr4_b4aa9c32-5e03-4e35-b545-2d6a820ebcb1/swift-ring-rebalance/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.303628 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_memcached-0_73fd1de5-1028-47e5-b204-e69c0f1cd028/memcached/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.381554 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/account-auditor/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.471163 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/account-reaper/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.498580 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/account-server/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.528192 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/account-replicator/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.549608 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/container-auditor/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.627595 4831 log.go:25] "Finished 
parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/container-replicator/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.657697 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/container-server/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.723227 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/object-auditor/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.725168 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/container-updater/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.835109 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/object-expirer/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.838010 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/object-replicator/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.914736 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/object-server/0.log" Mar 09 16:44:17 crc kubenswrapper[4831]: I0309 16:44:17.940245 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/object-updater/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.058366 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/swift-recon-cron/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.058420 4831 log.go:25] "Finished 
parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_065611bd-2987-4efb-b743-8045f7ec18fc/rsync/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.129751 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/account-auditor/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.276211 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/account-replicator/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.295553 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/account-server/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.313521 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/container-auditor/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.316488 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/account-reaper/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.471908 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/container-replicator/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.474098 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/container-updater/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.476457 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/container-server/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.489117 4831 log.go:25] "Finished parsing 
log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/object-auditor/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.614274 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/object-expirer/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.653723 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/object-updater/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.656893 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/object-replicator/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.681088 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/object-server/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.690105 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/rsync/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.794862 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_594b7519-72cc-45b8-ab1a-2bcac1e8f514/swift-recon-cron/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.836196 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/account-auditor/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.863268 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/account-reaper/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.906916 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/account-replicator/0.log" Mar 09 16:44:18 crc kubenswrapper[4831]: I0309 16:44:18.965415 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/account-server/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.017683 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/container-auditor/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.044816 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/container-replicator/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.069456 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/container-server/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.082235 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/container-updater/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.152971 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/object-auditor/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.187232 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/object-expirer/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.236920 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/object-replicator/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.253689 4831 log.go:25] "Finished parsing log 
file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/object-server/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.302285 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/object-updater/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.331959 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/rsync/0.log" Mar 09 16:44:19 crc kubenswrapper[4831]: I0309 16:44:19.384060 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_0232625e-8378-498f-9689-2ba54adc50ed/swift-recon-cron/0.log" Mar 09 16:44:31 crc kubenswrapper[4831]: I0309 16:44:31.488568 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qntgd_806fda88-2f52-4fae-a835-3ffd3fd0e55e/extract-utilities/0.log" Mar 09 16:44:31 crc kubenswrapper[4831]: I0309 16:44:31.759165 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qntgd_806fda88-2f52-4fae-a835-3ffd3fd0e55e/extract-content/0.log" Mar 09 16:44:31 crc kubenswrapper[4831]: I0309 16:44:31.767838 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qntgd_806fda88-2f52-4fae-a835-3ffd3fd0e55e/extract-content/0.log" Mar 09 16:44:31 crc kubenswrapper[4831]: I0309 16:44:31.785627 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qntgd_806fda88-2f52-4fae-a835-3ffd3fd0e55e/extract-utilities/0.log" Mar 09 16:44:31 crc kubenswrapper[4831]: I0309 16:44:31.924931 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qntgd_806fda88-2f52-4fae-a835-3ffd3fd0e55e/extract-utilities/0.log" Mar 09 16:44:31 crc kubenswrapper[4831]: I0309 
16:44:31.943803 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qntgd_806fda88-2f52-4fae-a835-3ffd3fd0e55e/extract-content/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.201439 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx6mz_52031b53-29e3-4087-ac0a-db35877849bc/extract-utilities/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.407351 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx6mz_52031b53-29e3-4087-ac0a-db35877849bc/extract-utilities/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.448467 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx6mz_52031b53-29e3-4087-ac0a-db35877849bc/extract-content/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.486422 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx6mz_52031b53-29e3-4087-ac0a-db35877849bc/extract-content/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.510596 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qntgd_806fda88-2f52-4fae-a835-3ffd3fd0e55e/registry-server/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.672651 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx6mz_52031b53-29e3-4087-ac0a-db35877849bc/extract-content/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.705022 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx6mz_52031b53-29e3-4087-ac0a-db35877849bc/extract-utilities/0.log" Mar 09 16:44:32 crc kubenswrapper[4831]: I0309 16:44:32.930289 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0/util/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.018457 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.018522 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.056319 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0/util/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.067534 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0/pull/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.166971 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0/pull/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.168904 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx6mz_52031b53-29e3-4087-ac0a-db35877849bc/registry-server/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.380831 4831 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0/util/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.395289 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0/pull/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.405490 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47g275_5892b2b0-41c8-4fbc-8a19-ca44a6a35ed0/extract/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.563571 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-br9lf_130188b4-2b02-462a-9b83-cf6930ed2ea0/marketplace-operator/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.590034 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k56j6_19e8f5e0-340b-4e8d-8af0-ec018733fe09/extract-utilities/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.792113 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k56j6_19e8f5e0-340b-4e8d-8af0-ec018733fe09/extract-content/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.799965 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k56j6_19e8f5e0-340b-4e8d-8af0-ec018733fe09/extract-content/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.840200 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k56j6_19e8f5e0-340b-4e8d-8af0-ec018733fe09/extract-utilities/0.log" Mar 09 16:44:33 crc kubenswrapper[4831]: I0309 16:44:33.981340 4831 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k56j6_19e8f5e0-340b-4e8d-8af0-ec018733fe09/extract-content/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.004849 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k56j6_19e8f5e0-340b-4e8d-8af0-ec018733fe09/extract-utilities/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.081513 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k56j6_19e8f5e0-340b-4e8d-8af0-ec018733fe09/registry-server/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.209179 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2sr4l_bc0c6900-0238-4664-a91c-057629716456/extract-utilities/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.393556 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2sr4l_bc0c6900-0238-4664-a91c-057629716456/extract-utilities/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.393558 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2sr4l_bc0c6900-0238-4664-a91c-057629716456/extract-content/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.421357 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2sr4l_bc0c6900-0238-4664-a91c-057629716456/extract-content/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.588718 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2sr4l_bc0c6900-0238-4664-a91c-057629716456/extract-utilities/0.log" Mar 09 16:44:34 crc kubenswrapper[4831]: I0309 16:44:34.597741 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-2sr4l_bc0c6900-0238-4664-a91c-057629716456/extract-content/0.log" Mar 09 16:44:35 crc kubenswrapper[4831]: I0309 16:44:35.030666 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2sr4l_bc0c6900-0238-4664-a91c-057629716456/registry-server/0.log" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.144677 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt"] Mar 09 16:45:00 crc kubenswrapper[4831]: E0309 16:45:00.145560 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d1adef-dd6d-4a13-8adb-36459b7bfa46" containerName="oc" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.145576 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d1adef-dd6d-4a13-8adb-36459b7bfa46" containerName="oc" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.145754 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d1adef-dd6d-4a13-8adb-36459b7bfa46" containerName="oc" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.146356 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.148863 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.149152 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.155961 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt"] Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.244189 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870cf24f-0044-4816-9ca4-ff0a48afc593-config-volume\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.244268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870cf24f-0044-4816-9ca4-ff0a48afc593-secret-volume\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.244368 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c7p8\" (UniqueName: \"kubernetes.io/projected/870cf24f-0044-4816-9ca4-ff0a48afc593-kube-api-access-5c7p8\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.345472 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870cf24f-0044-4816-9ca4-ff0a48afc593-secret-volume\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.345520 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c7p8\" (UniqueName: \"kubernetes.io/projected/870cf24f-0044-4816-9ca4-ff0a48afc593-kube-api-access-5c7p8\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.345611 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870cf24f-0044-4816-9ca4-ff0a48afc593-config-volume\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.346503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870cf24f-0044-4816-9ca4-ff0a48afc593-config-volume\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.352096 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/870cf24f-0044-4816-9ca4-ff0a48afc593-secret-volume\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.361538 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c7p8\" (UniqueName: \"kubernetes.io/projected/870cf24f-0044-4816-9ca4-ff0a48afc593-kube-api-access-5c7p8\") pod \"collect-profiles-29551245-rrgxt\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.468240 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:00 crc kubenswrapper[4831]: I0309 16:45:00.885121 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt"] Mar 09 16:45:01 crc kubenswrapper[4831]: I0309 16:45:01.215521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" event={"ID":"870cf24f-0044-4816-9ca4-ff0a48afc593","Type":"ContainerStarted","Data":"b4a8ccb60c011c6fc7f76db797cba01aa9bc516848b522f524d6f6c51a3c0b66"} Mar 09 16:45:01 crc kubenswrapper[4831]: I0309 16:45:01.215578 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" event={"ID":"870cf24f-0044-4816-9ca4-ff0a48afc593","Type":"ContainerStarted","Data":"9c10013794a0d513bcd8ad8fd33a14ea43315d3c39bbba8b7db99a05d027f63a"} Mar 09 16:45:02 crc kubenswrapper[4831]: I0309 16:45:02.237410 4831 generic.go:334] "Generic (PLEG): container finished" podID="870cf24f-0044-4816-9ca4-ff0a48afc593" 
containerID="b4a8ccb60c011c6fc7f76db797cba01aa9bc516848b522f524d6f6c51a3c0b66" exitCode=0 Mar 09 16:45:02 crc kubenswrapper[4831]: I0309 16:45:02.238176 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" event={"ID":"870cf24f-0044-4816-9ca4-ff0a48afc593","Type":"ContainerDied","Data":"b4a8ccb60c011c6fc7f76db797cba01aa9bc516848b522f524d6f6c51a3c0b66"} Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.019517 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.020104 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.020180 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.021357 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae5cb9d1fa3196bef37c6cd6d3a50d7b49906c2370397743a70f5b00a61f5b04"} pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.021461 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://ae5cb9d1fa3196bef37c6cd6d3a50d7b49906c2370397743a70f5b00a61f5b04" gracePeriod=600 Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.257815 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="ae5cb9d1fa3196bef37c6cd6d3a50d7b49906c2370397743a70f5b00a61f5b04" exitCode=0 Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.258134 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"ae5cb9d1fa3196bef37c6cd6d3a50d7b49906c2370397743a70f5b00a61f5b04"} Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.258184 4831 scope.go:117] "RemoveContainer" containerID="8ed41fbca0b29a4553b284823a0a1d42c746a11eefd0bc86a6373f11c01c62c5" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.554229 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.714660 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870cf24f-0044-4816-9ca4-ff0a48afc593-config-volume\") pod \"870cf24f-0044-4816-9ca4-ff0a48afc593\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.714992 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870cf24f-0044-4816-9ca4-ff0a48afc593-secret-volume\") pod \"870cf24f-0044-4816-9ca4-ff0a48afc593\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.715053 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c7p8\" (UniqueName: \"kubernetes.io/projected/870cf24f-0044-4816-9ca4-ff0a48afc593-kube-api-access-5c7p8\") pod \"870cf24f-0044-4816-9ca4-ff0a48afc593\" (UID: \"870cf24f-0044-4816-9ca4-ff0a48afc593\") " Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.716155 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870cf24f-0044-4816-9ca4-ff0a48afc593-config-volume" (OuterVolumeSpecName: "config-volume") pod "870cf24f-0044-4816-9ca4-ff0a48afc593" (UID: "870cf24f-0044-4816-9ca4-ff0a48afc593"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.728130 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870cf24f-0044-4816-9ca4-ff0a48afc593-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "870cf24f-0044-4816-9ca4-ff0a48afc593" (UID: "870cf24f-0044-4816-9ca4-ff0a48afc593"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.728156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870cf24f-0044-4816-9ca4-ff0a48afc593-kube-api-access-5c7p8" (OuterVolumeSpecName: "kube-api-access-5c7p8") pod "870cf24f-0044-4816-9ca4-ff0a48afc593" (UID: "870cf24f-0044-4816-9ca4-ff0a48afc593"). InnerVolumeSpecName "kube-api-access-5c7p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.816627 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870cf24f-0044-4816-9ca4-ff0a48afc593-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.816669 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870cf24f-0044-4816-9ca4-ff0a48afc593-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 16:45:03 crc kubenswrapper[4831]: I0309 16:45:03.816682 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c7p8\" (UniqueName: \"kubernetes.io/projected/870cf24f-0044-4816-9ca4-ff0a48afc593-kube-api-access-5c7p8\") on node \"crc\" DevicePath \"\"" Mar 09 16:45:04 crc kubenswrapper[4831]: I0309 16:45:04.267158 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" event={"ID":"870cf24f-0044-4816-9ca4-ff0a48afc593","Type":"ContainerDied","Data":"9c10013794a0d513bcd8ad8fd33a14ea43315d3c39bbba8b7db99a05d027f63a"} Mar 09 16:45:04 crc kubenswrapper[4831]: I0309 16:45:04.267204 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c10013794a0d513bcd8ad8fd33a14ea43315d3c39bbba8b7db99a05d027f63a" Mar 09 16:45:04 crc kubenswrapper[4831]: I0309 16:45:04.267220 4831 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551245-rrgxt" Mar 09 16:45:04 crc kubenswrapper[4831]: I0309 16:45:04.269814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerStarted","Data":"d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98"} Mar 09 16:45:04 crc kubenswrapper[4831]: I0309 16:45:04.323227 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6"] Mar 09 16:45:04 crc kubenswrapper[4831]: I0309 16:45:04.328602 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551200-5bml6"] Mar 09 16:45:05 crc kubenswrapper[4831]: I0309 16:45:05.627848 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c796eb3-c14d-48be-882d-5ae13e12918a" path="/var/lib/kubelet/pods/3c796eb3-c14d-48be-882d-5ae13e12918a/volumes" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.389999 4831 scope.go:117] "RemoveContainer" containerID="eb6125224acfdf25c904eb2c5d3f2dcf65c444ca9bbb9989a33ceca2b3a91bc6" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.414228 4831 scope.go:117] "RemoveContainer" containerID="e96f3dc1086af4891161b15dffb55a5ea50c533520232ba195d6b56bf2be2a69" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.447447 4831 scope.go:117] "RemoveContainer" containerID="5cc54073d291c5c55a932670d8944495a404afe39a6fffdcbc968b577dc8b497" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.484140 4831 scope.go:117] "RemoveContainer" containerID="706598ceb64784f66c6e7128738053a81edf97bc4e403492d89b0539d93b4594" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.541472 4831 scope.go:117] "RemoveContainer" containerID="84bf6e168e59d432418cf76097abc50a141947e970253ea3a5086be439f761cf" Mar 09 16:45:06 crc 
kubenswrapper[4831]: I0309 16:45:06.572544 4831 scope.go:117] "RemoveContainer" containerID="954657b9fc4336db0ea9cca1779f37d7196d227522516d476b11ad503f89d698" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.597112 4831 scope.go:117] "RemoveContainer" containerID="249ef961987b962ba9f3fae85bcca9a7590a4d1db9e57ce72496eab9b129f4dc" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.620307 4831 scope.go:117] "RemoveContainer" containerID="45184c129ef0615c5176a8858e9424891eccc2d8d8aad7ed075c73b1023275a3" Mar 09 16:45:06 crc kubenswrapper[4831]: I0309 16:45:06.640284 4831 scope.go:117] "RemoveContainer" containerID="0b5b68182e9990ddf4ff5add30c1f44f51b45bd71f821182bb07150e8ff8499c" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.709077 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l8g8n"] Mar 09 16:45:19 crc kubenswrapper[4831]: E0309 16:45:19.710199 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870cf24f-0044-4816-9ca4-ff0a48afc593" containerName="collect-profiles" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.710216 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="870cf24f-0044-4816-9ca4-ff0a48afc593" containerName="collect-profiles" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.710440 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="870cf24f-0044-4816-9ca4-ff0a48afc593" containerName="collect-profiles" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.711708 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.774705 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8g8n"] Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.859564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-catalog-content\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.859650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62pqd\" (UniqueName: \"kubernetes.io/projected/1739b3c1-29a4-4edd-a2f6-795d79b2864d-kube-api-access-62pqd\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.859992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-utilities\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.961795 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-utilities\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.961875 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-catalog-content\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.961916 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62pqd\" (UniqueName: \"kubernetes.io/projected/1739b3c1-29a4-4edd-a2f6-795d79b2864d-kube-api-access-62pqd\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.962556 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-utilities\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.962629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-catalog-content\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:19 crc kubenswrapper[4831]: I0309 16:45:19.980004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62pqd\" (UniqueName: \"kubernetes.io/projected/1739b3c1-29a4-4edd-a2f6-795d79b2864d-kube-api-access-62pqd\") pod \"community-operators-l8g8n\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:20 crc kubenswrapper[4831]: I0309 16:45:20.032833 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:20 crc kubenswrapper[4831]: I0309 16:45:20.611829 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8g8n"] Mar 09 16:45:21 crc kubenswrapper[4831]: I0309 16:45:21.429448 4831 generic.go:334] "Generic (PLEG): container finished" podID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerID="e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa" exitCode=0 Mar 09 16:45:21 crc kubenswrapper[4831]: I0309 16:45:21.429517 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g8n" event={"ID":"1739b3c1-29a4-4edd-a2f6-795d79b2864d","Type":"ContainerDied","Data":"e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa"} Mar 09 16:45:21 crc kubenswrapper[4831]: I0309 16:45:21.429960 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g8n" event={"ID":"1739b3c1-29a4-4edd-a2f6-795d79b2864d","Type":"ContainerStarted","Data":"5cc127fbd10b7f65d855985a1258beb3407cb84a52cf881f70f198bd2f19dbed"} Mar 09 16:45:23 crc kubenswrapper[4831]: I0309 16:45:23.456884 4831 generic.go:334] "Generic (PLEG): container finished" podID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerID="8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed" exitCode=0 Mar 09 16:45:23 crc kubenswrapper[4831]: I0309 16:45:23.457053 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g8n" event={"ID":"1739b3c1-29a4-4edd-a2f6-795d79b2864d","Type":"ContainerDied","Data":"8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed"} Mar 09 16:45:24 crc kubenswrapper[4831]: I0309 16:45:24.469348 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g8n" 
event={"ID":"1739b3c1-29a4-4edd-a2f6-795d79b2864d","Type":"ContainerStarted","Data":"3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0"} Mar 09 16:45:30 crc kubenswrapper[4831]: I0309 16:45:30.033297 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:30 crc kubenswrapper[4831]: I0309 16:45:30.033837 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:30 crc kubenswrapper[4831]: I0309 16:45:30.081938 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:30 crc kubenswrapper[4831]: I0309 16:45:30.103298 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l8g8n" podStartSLOduration=8.556123308 podStartE2EDuration="11.103264551s" podCreationTimestamp="2026-03-09 16:45:19 +0000 UTC" firstStartedPulling="2026-03-09 16:45:21.430864082 +0000 UTC m=+2848.564546505" lastFinishedPulling="2026-03-09 16:45:23.978005325 +0000 UTC m=+2851.111687748" observedRunningTime="2026-03-09 16:45:24.501502548 +0000 UTC m=+2851.635184971" watchObservedRunningTime="2026-03-09 16:45:30.103264551 +0000 UTC m=+2857.236946974" Mar 09 16:45:30 crc kubenswrapper[4831]: I0309 16:45:30.566743 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:30 crc kubenswrapper[4831]: I0309 16:45:30.612148 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8g8n"] Mar 09 16:45:32 crc kubenswrapper[4831]: I0309 16:45:32.529233 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l8g8n" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="registry-server" 
containerID="cri-o://3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0" gracePeriod=2 Mar 09 16:45:32 crc kubenswrapper[4831]: I0309 16:45:32.949840 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.102989 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-catalog-content\") pod \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.103032 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-utilities\") pod \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.103126 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62pqd\" (UniqueName: \"kubernetes.io/projected/1739b3c1-29a4-4edd-a2f6-795d79b2864d-kube-api-access-62pqd\") pod \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\" (UID: \"1739b3c1-29a4-4edd-a2f6-795d79b2864d\") " Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.104180 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-utilities" (OuterVolumeSpecName: "utilities") pod "1739b3c1-29a4-4edd-a2f6-795d79b2864d" (UID: "1739b3c1-29a4-4edd-a2f6-795d79b2864d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.109543 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1739b3c1-29a4-4edd-a2f6-795d79b2864d-kube-api-access-62pqd" (OuterVolumeSpecName: "kube-api-access-62pqd") pod "1739b3c1-29a4-4edd-a2f6-795d79b2864d" (UID: "1739b3c1-29a4-4edd-a2f6-795d79b2864d"). InnerVolumeSpecName "kube-api-access-62pqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.191707 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1739b3c1-29a4-4edd-a2f6-795d79b2864d" (UID: "1739b3c1-29a4-4edd-a2f6-795d79b2864d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.205114 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.205154 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1739b3c1-29a4-4edd-a2f6-795d79b2864d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.205168 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62pqd\" (UniqueName: \"kubernetes.io/projected/1739b3c1-29a4-4edd-a2f6-795d79b2864d-kube-api-access-62pqd\") on node \"crc\" DevicePath \"\"" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.537693 4831 generic.go:334] "Generic (PLEG): container finished" podID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" 
containerID="3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0" exitCode=0 Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.537782 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g8n" event={"ID":"1739b3c1-29a4-4edd-a2f6-795d79b2864d","Type":"ContainerDied","Data":"3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0"} Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.537809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g8n" event={"ID":"1739b3c1-29a4-4edd-a2f6-795d79b2864d","Type":"ContainerDied","Data":"5cc127fbd10b7f65d855985a1258beb3407cb84a52cf881f70f198bd2f19dbed"} Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.537826 4831 scope.go:117] "RemoveContainer" containerID="3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.537970 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8g8n" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.564892 4831 scope.go:117] "RemoveContainer" containerID="8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.574536 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8g8n"] Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.592765 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l8g8n"] Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.602986 4831 scope.go:117] "RemoveContainer" containerID="e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.628275 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" path="/var/lib/kubelet/pods/1739b3c1-29a4-4edd-a2f6-795d79b2864d/volumes" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.635269 4831 scope.go:117] "RemoveContainer" containerID="3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0" Mar 09 16:45:33 crc kubenswrapper[4831]: E0309 16:45:33.635754 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0\": container with ID starting with 3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0 not found: ID does not exist" containerID="3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.635795 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0"} err="failed to get container status 
\"3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0\": rpc error: code = NotFound desc = could not find container \"3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0\": container with ID starting with 3dcbb84912f31c6ef1c14505fd74ea0f39a9794d511b5da53828210a3e726cc0 not found: ID does not exist" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.635819 4831 scope.go:117] "RemoveContainer" containerID="8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed" Mar 09 16:45:33 crc kubenswrapper[4831]: E0309 16:45:33.636119 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed\": container with ID starting with 8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed not found: ID does not exist" containerID="8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.636148 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed"} err="failed to get container status \"8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed\": rpc error: code = NotFound desc = could not find container \"8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed\": container with ID starting with 8a1f2dcfd5dd749041df3888fb807f79132ca9705b004f704d48b7b2381957ed not found: ID does not exist" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.636165 4831 scope.go:117] "RemoveContainer" containerID="e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa" Mar 09 16:45:33 crc kubenswrapper[4831]: E0309 16:45:33.636476 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa\": container with ID starting with e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa not found: ID does not exist" containerID="e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa" Mar 09 16:45:33 crc kubenswrapper[4831]: I0309 16:45:33.636501 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa"} err="failed to get container status \"e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa\": rpc error: code = NotFound desc = could not find container \"e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa\": container with ID starting with e5bb5223b4c1d390d1c455cb3161921381f20e510aec047fd518e59af21560fa not found: ID does not exist" Mar 09 16:45:56 crc kubenswrapper[4831]: I0309 16:45:56.877004 4831 generic.go:334] "Generic (PLEG): container finished" podID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerID="649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84" exitCode=0 Mar 09 16:45:56 crc kubenswrapper[4831]: I0309 16:45:56.877069 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qdhl8/must-gather-p64fv" event={"ID":"ee9f6bb3-a55d-402e-873d-6c470c44deed","Type":"ContainerDied","Data":"649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84"} Mar 09 16:45:56 crc kubenswrapper[4831]: I0309 16:45:56.879090 4831 scope.go:117] "RemoveContainer" containerID="649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84" Mar 09 16:45:57 crc kubenswrapper[4831]: I0309 16:45:57.036121 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qdhl8_must-gather-p64fv_ee9f6bb3-a55d-402e-873d-6c470c44deed/gather/0.log" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.141359 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29551246-z9t27"] Mar 09 16:46:00 crc kubenswrapper[4831]: E0309 16:46:00.142072 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="extract-content" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.142089 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="extract-content" Mar 09 16:46:00 crc kubenswrapper[4831]: E0309 16:46:00.142114 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="extract-utilities" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.142122 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="extract-utilities" Mar 09 16:46:00 crc kubenswrapper[4831]: E0309 16:46:00.142147 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="registry-server" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.142169 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="registry-server" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.142380 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1739b3c1-29a4-4edd-a2f6-795d79b2864d" containerName="registry-server" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.143035 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551246-z9t27" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.145453 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.145530 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.146127 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.149796 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551246-z9t27"] Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.263678 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jjl\" (UniqueName: \"kubernetes.io/projected/4e5c5805-0e1c-4d3b-aac6-1c9e33e85209-kube-api-access-q2jjl\") pod \"auto-csr-approver-29551246-z9t27\" (UID: \"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209\") " pod="openshift-infra/auto-csr-approver-29551246-z9t27" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.366077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jjl\" (UniqueName: \"kubernetes.io/projected/4e5c5805-0e1c-4d3b-aac6-1c9e33e85209-kube-api-access-q2jjl\") pod \"auto-csr-approver-29551246-z9t27\" (UID: \"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209\") " pod="openshift-infra/auto-csr-approver-29551246-z9t27" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.386557 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jjl\" (UniqueName: \"kubernetes.io/projected/4e5c5805-0e1c-4d3b-aac6-1c9e33e85209-kube-api-access-q2jjl\") pod \"auto-csr-approver-29551246-z9t27\" (UID: \"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209\") " 
pod="openshift-infra/auto-csr-approver-29551246-z9t27" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.470482 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551246-z9t27" Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.729661 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551246-z9t27"] Mar 09 16:46:00 crc kubenswrapper[4831]: I0309 16:46:00.931590 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551246-z9t27" event={"ID":"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209","Type":"ContainerStarted","Data":"9f45d0c85b6e6ad79d33450d69a72a7bc87b987dfc071faae6c3ec6f7f111045"} Mar 09 16:46:02 crc kubenswrapper[4831]: I0309 16:46:02.958862 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e5c5805-0e1c-4d3b-aac6-1c9e33e85209" containerID="44257b4f8c34f09dd4205461de28232604ef0b7347fcef1dace85f1262e7aa8a" exitCode=0 Mar 09 16:46:02 crc kubenswrapper[4831]: I0309 16:46:02.958950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551246-z9t27" event={"ID":"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209","Type":"ContainerDied","Data":"44257b4f8c34f09dd4205461de28232604ef0b7347fcef1dace85f1262e7aa8a"} Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.243467 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551246-z9t27" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.275148 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qdhl8/must-gather-p64fv"] Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.275435 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qdhl8/must-gather-p64fv" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerName="copy" containerID="cri-o://2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2" gracePeriod=2 Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.284991 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qdhl8/must-gather-p64fv"] Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.342107 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2jjl\" (UniqueName: \"kubernetes.io/projected/4e5c5805-0e1c-4d3b-aac6-1c9e33e85209-kube-api-access-q2jjl\") pod \"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209\" (UID: \"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209\") " Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.347181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5c5805-0e1c-4d3b-aac6-1c9e33e85209-kube-api-access-q2jjl" (OuterVolumeSpecName: "kube-api-access-q2jjl") pod "4e5c5805-0e1c-4d3b-aac6-1c9e33e85209" (UID: "4e5c5805-0e1c-4d3b-aac6-1c9e33e85209"). InnerVolumeSpecName "kube-api-access-q2jjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.444152 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2jjl\" (UniqueName: \"kubernetes.io/projected/4e5c5805-0e1c-4d3b-aac6-1c9e33e85209-kube-api-access-q2jjl\") on node \"crc\" DevicePath \"\"" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.622804 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qdhl8_must-gather-p64fv_ee9f6bb3-a55d-402e-873d-6c470c44deed/copy/0.log" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.623197 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.646621 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee9f6bb3-a55d-402e-873d-6c470c44deed-must-gather-output\") pod \"ee9f6bb3-a55d-402e-873d-6c470c44deed\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.646680 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b6tt\" (UniqueName: \"kubernetes.io/projected/ee9f6bb3-a55d-402e-873d-6c470c44deed-kube-api-access-9b6tt\") pod \"ee9f6bb3-a55d-402e-873d-6c470c44deed\" (UID: \"ee9f6bb3-a55d-402e-873d-6c470c44deed\") " Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.650652 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9f6bb3-a55d-402e-873d-6c470c44deed-kube-api-access-9b6tt" (OuterVolumeSpecName: "kube-api-access-9b6tt") pod "ee9f6bb3-a55d-402e-873d-6c470c44deed" (UID: "ee9f6bb3-a55d-402e-873d-6c470c44deed"). InnerVolumeSpecName "kube-api-access-9b6tt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.736689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee9f6bb3-a55d-402e-873d-6c470c44deed-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ee9f6bb3-a55d-402e-873d-6c470c44deed" (UID: "ee9f6bb3-a55d-402e-873d-6c470c44deed"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.748867 4831 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee9f6bb3-a55d-402e-873d-6c470c44deed-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.748907 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b6tt\" (UniqueName: \"kubernetes.io/projected/ee9f6bb3-a55d-402e-873d-6c470c44deed-kube-api-access-9b6tt\") on node \"crc\" DevicePath \"\"" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.975675 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551246-z9t27" event={"ID":"4e5c5805-0e1c-4d3b-aac6-1c9e33e85209","Type":"ContainerDied","Data":"9f45d0c85b6e6ad79d33450d69a72a7bc87b987dfc071faae6c3ec6f7f111045"} Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.975724 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f45d0c85b6e6ad79d33450d69a72a7bc87b987dfc071faae6c3ec6f7f111045" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.975755 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551246-z9t27" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.977460 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qdhl8_must-gather-p64fv_ee9f6bb3-a55d-402e-873d-6c470c44deed/copy/0.log" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.977933 4831 generic.go:334] "Generic (PLEG): container finished" podID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerID="2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2" exitCode=143 Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.977971 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qdhl8/must-gather-p64fv" Mar 09 16:46:04 crc kubenswrapper[4831]: I0309 16:46:04.977984 4831 scope.go:117] "RemoveContainer" containerID="2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2" Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.004655 4831 scope.go:117] "RemoveContainer" containerID="649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84" Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.065301 4831 scope.go:117] "RemoveContainer" containerID="2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2" Mar 09 16:46:05 crc kubenswrapper[4831]: E0309 16:46:05.065953 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2\": container with ID starting with 2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2 not found: ID does not exist" containerID="2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2" Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.066003 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2"} 
err="failed to get container status \"2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2\": rpc error: code = NotFound desc = could not find container \"2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2\": container with ID starting with 2eef4845fda64c55347ba0ac438c0d1079d5f3ca3aeb9e5fa9a349c8508087e2 not found: ID does not exist" Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.066031 4831 scope.go:117] "RemoveContainer" containerID="649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84" Mar 09 16:46:05 crc kubenswrapper[4831]: E0309 16:46:05.066421 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84\": container with ID starting with 649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84 not found: ID does not exist" containerID="649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84" Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.066478 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84"} err="failed to get container status \"649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84\": rpc error: code = NotFound desc = could not find container \"649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84\": container with ID starting with 649344c213e76f9fd489d8d4faba6a462886d1651be420990beb54e659aa8d84 not found: ID does not exist" Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.294907 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551240-s2zfz"] Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.300918 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551240-s2zfz"] Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 
16:46:05.625907 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c0ffa7f-eeb8-489b-a60a-b04b1d731453" path="/var/lib/kubelet/pods/8c0ffa7f-eeb8-489b-a60a-b04b1d731453/volumes" Mar 09 16:46:05 crc kubenswrapper[4831]: I0309 16:46:05.627117 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" path="/var/lib/kubelet/pods/ee9f6bb3-a55d-402e-873d-6c470c44deed/volumes" Mar 09 16:46:06 crc kubenswrapper[4831]: I0309 16:46:06.797706 4831 scope.go:117] "RemoveContainer" containerID="e01b8603f7f1d8069c7f35994516468940ecd043607350a51bdd5e83e6ecd534" Mar 09 16:46:06 crc kubenswrapper[4831]: I0309 16:46:06.836784 4831 scope.go:117] "RemoveContainer" containerID="8f0fba3b235b0546ef92b382d368ad82cd85e01858224f5d11c596606d6f1075" Mar 09 16:46:06 crc kubenswrapper[4831]: I0309 16:46:06.887350 4831 scope.go:117] "RemoveContainer" containerID="284632c86a0d7459fdc9909e8f4a9046e6f39a0d6ebd4666495255014b0f875a" Mar 09 16:46:06 crc kubenswrapper[4831]: I0309 16:46:06.907484 4831 scope.go:117] "RemoveContainer" containerID="b1d3a40cdf7488df7cef2675cbc40c911ec3aba1dec209921244b93184350dbc" Mar 09 16:46:06 crc kubenswrapper[4831]: I0309 16:46:06.960454 4831 scope.go:117] "RemoveContainer" containerID="882fa21516b5ae75d24ca2b43c779677678200693f5ed0cad8396f1062c7d08e" Mar 09 16:46:07 crc kubenswrapper[4831]: I0309 16:46:07.002915 4831 scope.go:117] "RemoveContainer" containerID="b5c20f460aaa00f0935c3fdcf1cda6848dea6e986b1db5cdb2b148b74841fbf2" Mar 09 16:46:07 crc kubenswrapper[4831]: I0309 16:46:07.027662 4831 scope.go:117] "RemoveContainer" containerID="f8f97e2dc9aab56a1f40274d59cd4bcb1e0c2300fb978718031debf4e5c9980f" Mar 09 16:46:07 crc kubenswrapper[4831]: I0309 16:46:07.056966 4831 scope.go:117] "RemoveContainer" containerID="865c6fcd4f171ef7e17c208c7a45f30164ffd3f76d1f925507b6b2e83f3d19e1" Mar 09 16:46:07 crc kubenswrapper[4831]: I0309 16:46:07.081785 4831 scope.go:117] "RemoveContainer" 
containerID="55171b979ae27e87bf5e360569881bb48fccf8a6c6d00c1c3d0703518916f752" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.206590 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jq4mw"] Mar 09 16:46:37 crc kubenswrapper[4831]: E0309 16:46:37.207490 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerName="gather" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.207511 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerName="gather" Mar 09 16:46:37 crc kubenswrapper[4831]: E0309 16:46:37.207529 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5c5805-0e1c-4d3b-aac6-1c9e33e85209" containerName="oc" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.207536 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5c5805-0e1c-4d3b-aac6-1c9e33e85209" containerName="oc" Mar 09 16:46:37 crc kubenswrapper[4831]: E0309 16:46:37.207562 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerName="copy" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.207569 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerName="copy" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.207738 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerName="gather" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.207761 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5c5805-0e1c-4d3b-aac6-1c9e33e85209" containerName="oc" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.207786 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9f6bb3-a55d-402e-873d-6c470c44deed" containerName="copy" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 
16:46:37.209025 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.214238 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq4mw"] Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.245389 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-catalog-content\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.245506 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2knmh\" (UniqueName: \"kubernetes.io/projected/512b57c1-fe63-4d13-a53b-a376e2138de3-kube-api-access-2knmh\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.245563 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-utilities\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.346973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-catalog-content\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.347035 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2knmh\" (UniqueName: \"kubernetes.io/projected/512b57c1-fe63-4d13-a53b-a376e2138de3-kube-api-access-2knmh\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.347085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-utilities\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.347609 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-catalog-content\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.347942 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-utilities\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.373780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2knmh\" (UniqueName: \"kubernetes.io/projected/512b57c1-fe63-4d13-a53b-a376e2138de3-kube-api-access-2knmh\") pod \"redhat-operators-jq4mw\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:37 crc kubenswrapper[4831]: I0309 16:46:37.543059 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:38 crc kubenswrapper[4831]: I0309 16:46:38.007120 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq4mw"] Mar 09 16:46:38 crc kubenswrapper[4831]: I0309 16:46:38.275106 4831 generic.go:334] "Generic (PLEG): container finished" podID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerID="eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378" exitCode=0 Mar 09 16:46:38 crc kubenswrapper[4831]: I0309 16:46:38.275151 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq4mw" event={"ID":"512b57c1-fe63-4d13-a53b-a376e2138de3","Type":"ContainerDied","Data":"eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378"} Mar 09 16:46:38 crc kubenswrapper[4831]: I0309 16:46:38.275174 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq4mw" event={"ID":"512b57c1-fe63-4d13-a53b-a376e2138de3","Type":"ContainerStarted","Data":"870abf94fea9ba9bf286af6e0dfaf7a2775ae6866a52f687c45dab41f6cd955b"} Mar 09 16:46:39 crc kubenswrapper[4831]: I0309 16:46:39.284178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq4mw" event={"ID":"512b57c1-fe63-4d13-a53b-a376e2138de3","Type":"ContainerStarted","Data":"d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2"} Mar 09 16:46:40 crc kubenswrapper[4831]: I0309 16:46:40.293299 4831 generic.go:334] "Generic (PLEG): container finished" podID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerID="d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2" exitCode=0 Mar 09 16:46:40 crc kubenswrapper[4831]: I0309 16:46:40.293347 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq4mw" 
event={"ID":"512b57c1-fe63-4d13-a53b-a376e2138de3","Type":"ContainerDied","Data":"d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2"} Mar 09 16:46:41 crc kubenswrapper[4831]: I0309 16:46:41.304525 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq4mw" event={"ID":"512b57c1-fe63-4d13-a53b-a376e2138de3","Type":"ContainerStarted","Data":"28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb"} Mar 09 16:46:41 crc kubenswrapper[4831]: I0309 16:46:41.328680 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jq4mw" podStartSLOduration=1.8934039070000002 podStartE2EDuration="4.328659032s" podCreationTimestamp="2026-03-09 16:46:37 +0000 UTC" firstStartedPulling="2026-03-09 16:46:38.276582794 +0000 UTC m=+2925.410265227" lastFinishedPulling="2026-03-09 16:46:40.711837929 +0000 UTC m=+2927.845520352" observedRunningTime="2026-03-09 16:46:41.321623061 +0000 UTC m=+2928.455305494" watchObservedRunningTime="2026-03-09 16:46:41.328659032 +0000 UTC m=+2928.462341455" Mar 09 16:46:47 crc kubenswrapper[4831]: I0309 16:46:47.544085 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:47 crc kubenswrapper[4831]: I0309 16:46:47.545008 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:47 crc kubenswrapper[4831]: I0309 16:46:47.587971 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:48 crc kubenswrapper[4831]: I0309 16:46:48.415354 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:48 crc kubenswrapper[4831]: I0309 16:46:48.467750 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-jq4mw"] Mar 09 16:46:50 crc kubenswrapper[4831]: I0309 16:46:50.381059 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jq4mw" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="registry-server" containerID="cri-o://28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb" gracePeriod=2 Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.023560 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.107682 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-catalog-content\") pod \"512b57c1-fe63-4d13-a53b-a376e2138de3\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.107756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2knmh\" (UniqueName: \"kubernetes.io/projected/512b57c1-fe63-4d13-a53b-a376e2138de3-kube-api-access-2knmh\") pod \"512b57c1-fe63-4d13-a53b-a376e2138de3\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.107787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-utilities\") pod \"512b57c1-fe63-4d13-a53b-a376e2138de3\" (UID: \"512b57c1-fe63-4d13-a53b-a376e2138de3\") " Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.108809 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-utilities" (OuterVolumeSpecName: "utilities") pod "512b57c1-fe63-4d13-a53b-a376e2138de3" (UID: 
"512b57c1-fe63-4d13-a53b-a376e2138de3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.112915 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512b57c1-fe63-4d13-a53b-a376e2138de3-kube-api-access-2knmh" (OuterVolumeSpecName: "kube-api-access-2knmh") pod "512b57c1-fe63-4d13-a53b-a376e2138de3" (UID: "512b57c1-fe63-4d13-a53b-a376e2138de3"). InnerVolumeSpecName "kube-api-access-2knmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.210719 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2knmh\" (UniqueName: \"kubernetes.io/projected/512b57c1-fe63-4d13-a53b-a376e2138de3-kube-api-access-2knmh\") on node \"crc\" DevicePath \"\"" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.210787 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.268460 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "512b57c1-fe63-4d13-a53b-a376e2138de3" (UID: "512b57c1-fe63-4d13-a53b-a376e2138de3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.311960 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512b57c1-fe63-4d13-a53b-a376e2138de3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.398991 4831 generic.go:334] "Generic (PLEG): container finished" podID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerID="28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb" exitCode=0 Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.399084 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq4mw" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.399056 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq4mw" event={"ID":"512b57c1-fe63-4d13-a53b-a376e2138de3","Type":"ContainerDied","Data":"28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb"} Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.399212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq4mw" event={"ID":"512b57c1-fe63-4d13-a53b-a376e2138de3","Type":"ContainerDied","Data":"870abf94fea9ba9bf286af6e0dfaf7a2775ae6866a52f687c45dab41f6cd955b"} Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.399237 4831 scope.go:117] "RemoveContainer" containerID="28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.437167 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jq4mw"] Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.448844 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jq4mw"] Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.476494 
4831 scope.go:117] "RemoveContainer" containerID="d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.500418 4831 scope.go:117] "RemoveContainer" containerID="eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.532723 4831 scope.go:117] "RemoveContainer" containerID="28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb" Mar 09 16:46:52 crc kubenswrapper[4831]: E0309 16:46:52.533379 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb\": container with ID starting with 28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb not found: ID does not exist" containerID="28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.533437 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb"} err="failed to get container status \"28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb\": rpc error: code = NotFound desc = could not find container \"28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb\": container with ID starting with 28876097a9d4bbaaa2c14bc351a516aa3194269978955828308f45742da37ebb not found: ID does not exist" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.533467 4831 scope.go:117] "RemoveContainer" containerID="d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2" Mar 09 16:46:52 crc kubenswrapper[4831]: E0309 16:46:52.533919 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2\": container with ID starting 
with d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2 not found: ID does not exist" containerID="d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.533947 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2"} err="failed to get container status \"d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2\": rpc error: code = NotFound desc = could not find container \"d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2\": container with ID starting with d412d63ecdbde4c7810e36b34262be894f459075cc2745a5a77fade3791967b2 not found: ID does not exist" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.533967 4831 scope.go:117] "RemoveContainer" containerID="eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378" Mar 09 16:46:52 crc kubenswrapper[4831]: E0309 16:46:52.534284 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378\": container with ID starting with eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378 not found: ID does not exist" containerID="eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378" Mar 09 16:46:52 crc kubenswrapper[4831]: I0309 16:46:52.534317 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378"} err="failed to get container status \"eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378\": rpc error: code = NotFound desc = could not find container \"eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378\": container with ID starting with eca63fdc8e8fb7c8553f0cbdce53bf4c4d467688a4447fcff23fd810db44d378 not found: ID does 
not exist" Mar 09 16:46:53 crc kubenswrapper[4831]: I0309 16:46:53.633013 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" path="/var/lib/kubelet/pods/512b57c1-fe63-4d13-a53b-a376e2138de3/volumes" Mar 09 16:47:03 crc kubenswrapper[4831]: I0309 16:47:03.019105 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:47:03 crc kubenswrapper[4831]: I0309 16:47:03.019729 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.287647 4831 scope.go:117] "RemoveContainer" containerID="e9b2b389a9bd17a4f97a29b6537d0db0331d171ab8a64fb380987b79320915a8" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.321557 4831 scope.go:117] "RemoveContainer" containerID="5ae8170330c8be2b1ef0c3df06a573932bf6a2598cb7c3978c28997d29e0ed07" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.355774 4831 scope.go:117] "RemoveContainer" containerID="b9f7a3b71a3c5f81135cf57998d3838a7d58493d8a41b4bbddaa457b6e262981" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.380767 4831 scope.go:117] "RemoveContainer" containerID="fc409beabce962debd7389315052a4e7870b27bf28e596e850b30ef5b22cad79" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.429345 4831 scope.go:117] "RemoveContainer" containerID="e5952d3a3a64edd3791066ece2f9d2331e4c0637feb331018b610a6febb050dc" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.455726 4831 scope.go:117] 
"RemoveContainer" containerID="4549385f2b07202779fb3d5fcb640fdcf26deddec63882783561bbdecafbc192" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.485433 4831 scope.go:117] "RemoveContainer" containerID="bcae82598dbbc530a18515b2e2c5961e10856de607ba63e57c569400c3077f91" Mar 09 16:47:07 crc kubenswrapper[4831]: I0309 16:47:07.507478 4831 scope.go:117] "RemoveContainer" containerID="a15f8f9e482edf0e49c9294fb3e0e39d799cdb93fc590cc73692e1409710ce8d" Mar 09 16:47:33 crc kubenswrapper[4831]: I0309 16:47:33.018613 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:47:33 crc kubenswrapper[4831]: I0309 16:47:33.019171 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.153897 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551248-bp2lz"] Mar 09 16:48:00 crc kubenswrapper[4831]: E0309 16:48:00.154935 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="extract-content" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.154951 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="extract-content" Mar 09 16:48:00 crc kubenswrapper[4831]: E0309 16:48:00.154963 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="extract-utilities" Mar 09 16:48:00 crc 
kubenswrapper[4831]: I0309 16:48:00.154971 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="extract-utilities" Mar 09 16:48:00 crc kubenswrapper[4831]: E0309 16:48:00.155006 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="registry-server" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.155012 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="registry-server" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.155169 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="512b57c1-fe63-4d13-a53b-a376e2138de3" containerName="registry-server" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.155811 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551248-bp2lz" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.161951 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.161982 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.162316 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5xfhq" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.163566 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551248-bp2lz"] Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.281108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkx9g\" (UniqueName: \"kubernetes.io/projected/46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0-kube-api-access-wkx9g\") pod 
\"auto-csr-approver-29551248-bp2lz\" (UID: \"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0\") " pod="openshift-infra/auto-csr-approver-29551248-bp2lz" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.383013 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkx9g\" (UniqueName: \"kubernetes.io/projected/46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0-kube-api-access-wkx9g\") pod \"auto-csr-approver-29551248-bp2lz\" (UID: \"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0\") " pod="openshift-infra/auto-csr-approver-29551248-bp2lz" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.409872 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkx9g\" (UniqueName: \"kubernetes.io/projected/46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0-kube-api-access-wkx9g\") pod \"auto-csr-approver-29551248-bp2lz\" (UID: \"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0\") " pod="openshift-infra/auto-csr-approver-29551248-bp2lz" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.474586 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551248-bp2lz" Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.917547 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551248-bp2lz"] Mar 09 16:48:00 crc kubenswrapper[4831]: I0309 16:48:00.921354 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 16:48:01 crc kubenswrapper[4831]: I0309 16:48:01.047377 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551248-bp2lz" event={"ID":"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0","Type":"ContainerStarted","Data":"ac2e3180c9a82220e5f1ba76ce237a9b042a0908d6f08a7fd63e4ab2e4fb11f9"} Mar 09 16:48:03 crc kubenswrapper[4831]: I0309 16:48:03.019363 4831 patch_prober.go:28] interesting pod/machine-config-daemon-4mvxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 16:48:03 crc kubenswrapper[4831]: I0309 16:48:03.020092 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 16:48:03 crc kubenswrapper[4831]: I0309 16:48:03.020137 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" Mar 09 16:48:03 crc kubenswrapper[4831]: I0309 16:48:03.020760 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98"} 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 16:48:03 crc kubenswrapper[4831]: I0309 16:48:03.020826 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerName="machine-config-daemon" containerID="cri-o://d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" gracePeriod=600 Mar 09 16:48:03 crc kubenswrapper[4831]: I0309 16:48:03.063997 4831 generic.go:334] "Generic (PLEG): container finished" podID="46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0" containerID="0852ff4d7ce05e73704fc66a36d30f6703aff5f38e690cdc9b1299da2d8b5d96" exitCode=0 Mar 09 16:48:03 crc kubenswrapper[4831]: I0309 16:48:03.064056 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551248-bp2lz" event={"ID":"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0","Type":"ContainerDied","Data":"0852ff4d7ce05e73704fc66a36d30f6703aff5f38e690cdc9b1299da2d8b5d96"} Mar 09 16:48:03 crc kubenswrapper[4831]: E0309 16:48:03.148144 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.075564 4831 generic.go:334] "Generic (PLEG): container finished" podID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" containerID="d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" exitCode=0 Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.075632 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" event={"ID":"a1a80160-b9c7-4ecc-99be-4438a7c6ad9c","Type":"ContainerDied","Data":"d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98"} Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.075688 4831 scope.go:117] "RemoveContainer" containerID="ae5cb9d1fa3196bef37c6cd6d3a50d7b49906c2370397743a70f5b00a61f5b04" Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.076420 4831 scope.go:117] "RemoveContainer" containerID="d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" Mar 09 16:48:04 crc kubenswrapper[4831]: E0309 16:48:04.076716 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.356962 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551248-bp2lz" Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.542325 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkx9g\" (UniqueName: \"kubernetes.io/projected/46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0-kube-api-access-wkx9g\") pod \"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0\" (UID: \"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0\") " Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.548154 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0-kube-api-access-wkx9g" (OuterVolumeSpecName: "kube-api-access-wkx9g") pod "46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0" (UID: "46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0"). 
InnerVolumeSpecName "kube-api-access-wkx9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 16:48:04 crc kubenswrapper[4831]: I0309 16:48:04.644952 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkx9g\" (UniqueName: \"kubernetes.io/projected/46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0-kube-api-access-wkx9g\") on node \"crc\" DevicePath \"\"" Mar 09 16:48:05 crc kubenswrapper[4831]: I0309 16:48:05.085906 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551248-bp2lz" Mar 09 16:48:05 crc kubenswrapper[4831]: I0309 16:48:05.085889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551248-bp2lz" event={"ID":"46d29f0f-e42a-4b8f-81b5-c14ac8ebb7e0","Type":"ContainerDied","Data":"ac2e3180c9a82220e5f1ba76ce237a9b042a0908d6f08a7fd63e4ab2e4fb11f9"} Mar 09 16:48:05 crc kubenswrapper[4831]: I0309 16:48:05.086301 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac2e3180c9a82220e5f1ba76ce237a9b042a0908d6f08a7fd63e4ab2e4fb11f9" Mar 09 16:48:05 crc kubenswrapper[4831]: I0309 16:48:05.426923 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551242-lb6t7"] Mar 09 16:48:05 crc kubenswrapper[4831]: I0309 16:48:05.432933 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551242-lb6t7"] Mar 09 16:48:05 crc kubenswrapper[4831]: I0309 16:48:05.626794 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305f28c8-2d47-4a94-8913-abe62148b492" path="/var/lib/kubelet/pods/305f28c8-2d47-4a94-8913-abe62148b492/volumes" Mar 09 16:48:07 crc kubenswrapper[4831]: I0309 16:48:07.654981 4831 scope.go:117] "RemoveContainer" containerID="e03d0ff347cf538554374cfd546c072801b63ed6f248d36a7b28edbca3e21bf7" Mar 09 16:48:07 crc kubenswrapper[4831]: I0309 16:48:07.688433 4831 scope.go:117] "RemoveContainer" 
containerID="f5c4e66f1d1683ef4594b2c86e38e7cae32fb46a91145b700fb1b3ddfefe6301" Mar 09 16:48:07 crc kubenswrapper[4831]: I0309 16:48:07.724283 4831 scope.go:117] "RemoveContainer" containerID="434391eb2c19ceb3ee7c98a58a1d1f3aa6e27624da0c2413e3bfbc0d46ed0559" Mar 09 16:48:07 crc kubenswrapper[4831]: I0309 16:48:07.767525 4831 scope.go:117] "RemoveContainer" containerID="4ad76ad6a474a057227ec94b4cb89e318d6933c5744df0bdebb5872f3c2d56fa" Mar 09 16:48:07 crc kubenswrapper[4831]: I0309 16:48:07.791706 4831 scope.go:117] "RemoveContainer" containerID="339ed4b6687c89305287281835ab24e886a3caa443bbd4e16191d08577d8df92" Mar 09 16:48:17 crc kubenswrapper[4831]: I0309 16:48:17.618236 4831 scope.go:117] "RemoveContainer" containerID="d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" Mar 09 16:48:17 crc kubenswrapper[4831]: E0309 16:48:17.619020 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:48:29 crc kubenswrapper[4831]: I0309 16:48:29.617058 4831 scope.go:117] "RemoveContainer" containerID="d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" Mar 09 16:48:29 crc kubenswrapper[4831]: E0309 16:48:29.617801 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:48:44 crc 
kubenswrapper[4831]: I0309 16:48:44.617952 4831 scope.go:117] "RemoveContainer" containerID="d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" Mar 09 16:48:44 crc kubenswrapper[4831]: E0309 16:48:44.618772 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:48:56 crc kubenswrapper[4831]: I0309 16:48:56.618401 4831 scope.go:117] "RemoveContainer" containerID="d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" Mar 09 16:48:56 crc kubenswrapper[4831]: E0309 16:48:56.619180 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c" Mar 09 16:49:11 crc kubenswrapper[4831]: I0309 16:49:11.618305 4831 scope.go:117] "RemoveContainer" containerID="d85552937ab73f553026a56db2eea3d7ae33f2f1cb70756a78658ef358074f98" Mar 09 16:49:11 crc kubenswrapper[4831]: E0309 16:49:11.619160 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4mvxc_openshift-machine-config-operator(a1a80160-b9c7-4ecc-99be-4438a7c6ad9c)\"" pod="openshift-machine-config-operator/machine-config-daemon-4mvxc" podUID="a1a80160-b9c7-4ecc-99be-4438a7c6ad9c"